Dec 04 17:26:27 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 04 17:26:27 crc restorecon[4764]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 17:26:27 crc restorecon[4764]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc 
restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc 
restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 
17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 
crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 17:26:27 crc restorecon[4764]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:27 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 17:26:28 crc restorecon[4764]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 
crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc 
restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 17:26:28 crc restorecon[4764]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc 
restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc 
restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc 
restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc 
restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 17:26:28 crc restorecon[4764]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 04 17:26:28 crc kubenswrapper[4948]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 17:26:28 crc kubenswrapper[4948]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 17:26:28 crc kubenswrapper[4948]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 17:26:28 crc kubenswrapper[4948]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 04 17:26:28 crc kubenswrapper[4948]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 17:26:28 crc kubenswrapper[4948]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.746368 4948 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749042 4948 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749070 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749076 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749080 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749084 4948 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749088 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749093 4948 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749097 4948 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749102 4948 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749106 4948 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749109 4948 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749113 4948 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749117 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749120 4948 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749124 4948 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749127 4948 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749131 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749134 4948 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749138 4948 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749149 4948 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749153 4948 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749157 4948 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749160 4948 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749164 4948 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749168 4948 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749171 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749175 4948 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749179 4948 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749182 4948 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749186 4948 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749189 4948 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749193 4948 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749196 4948 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749200 4948 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749203 4948 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749207 4948 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749210 4948 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749214 4948 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749218 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749221 4948 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749226 4948 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749230 4948 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749233 4948 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749237 4948 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749240 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749243 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749247 4948 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749250 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749253 4948 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749257 4948 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749260 4948 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749264 4948 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749267 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749270 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749276 4948 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749281 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749285 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749289 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749293 4948 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749297 4948 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749301 4948 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749304 4948 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749309 4948 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749314 4948 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749318 4948 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749324 4948 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749328 4948 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749332 4948 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749337 4948 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749341 4948 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.749345 4948 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749414 4948 flags.go:64] FLAG: --address="0.0.0.0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749422 4948 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749430 4948 flags.go:64] FLAG: --anonymous-auth="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749435 4948 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749440 4948 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749444 4948 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749450 4948 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749455 4948 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749460 4948 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749463 4948 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749468 4948 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749472 4948 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749476 4948 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749481 4948 flags.go:64] FLAG: --cgroup-root=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749484 4948 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749488 4948 flags.go:64] FLAG: --client-ca-file=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749492 4948 flags.go:64] FLAG: --cloud-config=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749496 4948 flags.go:64] FLAG: --cloud-provider=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749500 4948 flags.go:64] FLAG: --cluster-dns="[]"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749505 4948 flags.go:64] FLAG: --cluster-domain=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749509 4948 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749515 4948 flags.go:64] FLAG: --config-dir=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749519 4948 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749523 4948 flags.go:64] FLAG: --container-log-max-files="5"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749529 4948 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749534 4948 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749539 4948 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749544 4948 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749553 4948 flags.go:64] FLAG: --contention-profiling="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749561 4948 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749566 4948 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749571 4948 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749576 4948 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749583 4948 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749589 4948 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749594 4948 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749599 4948 flags.go:64] FLAG: --enable-load-reader="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749605 4948 flags.go:64] FLAG: --enable-server="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749609 4948 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749618 4948 flags.go:64] FLAG: --event-burst="100"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749623 4948 flags.go:64] FLAG: --event-qps="50"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749628 4948 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749633 4948 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749638 4948 flags.go:64] FLAG: --eviction-hard=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749644 4948 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749654 4948 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749661 4948 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749667 4948 flags.go:64] FLAG: --eviction-soft=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749672 4948 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749677 4948 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749682 4948 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749687 4948 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749692 4948 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749697 4948 flags.go:64] FLAG: --fail-swap-on="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749702 4948 flags.go:64] FLAG: --feature-gates=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749709 4948 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749716 4948 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749721 4948 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749725 4948 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749731 4948 flags.go:64] FLAG: --healthz-port="10248"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749735 4948 flags.go:64] FLAG: --help="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749740 4948 flags.go:64] FLAG: --hostname-override=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749744 4948 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749748 4948 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749752 4948 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749757 4948 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749761 4948 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749765 4948 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749769 4948 flags.go:64] FLAG: --image-service-endpoint=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749773 4948 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749777 4948 flags.go:64] FLAG: --kube-api-burst="100"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749782 4948 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749786 4948 flags.go:64] FLAG: --kube-api-qps="50"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749791 4948 flags.go:64] FLAG: --kube-reserved=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749795 4948 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749799 4948 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749805 4948 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749809 4948 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749813 4948 flags.go:64] FLAG: --lock-file=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749816 4948 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749821 4948 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749825 4948 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749831 4948 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749834 4948 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749838 4948 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749842 4948 flags.go:64] FLAG: --logging-format="text"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749847 4948 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749851 4948 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749855 4948 flags.go:64] FLAG: --manifest-url=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749859 4948 flags.go:64] FLAG: --manifest-url-header=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749864 4948 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749869 4948 flags.go:64] FLAG: --max-open-files="1000000"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749874 4948 flags.go:64] FLAG: --max-pods="110"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749878 4948 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749882 4948 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749886 4948 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749890 4948 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749895 4948 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749899 4948 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749903 4948 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749913 4948 flags.go:64] FLAG: --node-status-max-images="50"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749917 4948 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749921 4948 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749925 4948 flags.go:64] FLAG: --pod-cidr=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749929 4948 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749935 4948 flags.go:64] FLAG: --pod-manifest-path=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749939 4948 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749944 4948 flags.go:64] FLAG: --pods-per-core="0"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749948 4948 flags.go:64] FLAG: --port="10250"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749952 4948 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749956 4948 flags.go:64] FLAG: --provider-id=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749960 4948 flags.go:64] FLAG: --qos-reserved=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749964 4948 flags.go:64] FLAG: --read-only-port="10255"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749968 4948 flags.go:64] FLAG: --register-node="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749972 4948 flags.go:64] FLAG: --register-schedulable="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749976 4948 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749983 4948 flags.go:64] FLAG: --registry-burst="10"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749987 4948 flags.go:64] FLAG: --registry-qps="5"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749991 4948 flags.go:64] FLAG: --reserved-cpus=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.749995 4948 flags.go:64] FLAG: --reserved-memory=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750001 4948 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750005 4948 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750010 4948 flags.go:64] FLAG: --rotate-certificates="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750013 4948 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750017 4948 flags.go:64] FLAG: --runonce="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750021 4948 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750025 4948 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750030 4948 flags.go:64] FLAG: --seccomp-default="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750034 4948 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750042 4948 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750063 4948 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750068 4948 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750073 4948 flags.go:64] FLAG: --storage-driver-password="root"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750077 4948 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750081 4948 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750085 4948 flags.go:64] FLAG: --storage-driver-user="root"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750089 4948 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750093 4948 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750097 4948 flags.go:64] FLAG: --system-cgroups=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750101 4948 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750108 4948 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750111 4948 flags.go:64] FLAG: --tls-cert-file=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750115 4948 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750120 4948 flags.go:64] FLAG: --tls-min-version=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750125 4948 flags.go:64] FLAG: --tls-private-key-file=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750129 4948 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750133 4948 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750137 4948 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750141 4948 flags.go:64] FLAG: --v="2"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750148 4948 flags.go:64] FLAG: --version="false"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750153 4948 flags.go:64] FLAG: --vmodule=""
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750158 4948 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750162 4948 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750264 4948 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750269 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750273 4948 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750277 4948 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750281 4948 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750285 4948 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750289 4948 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750292 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750296 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750300 4948 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750306 4948 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750313 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750318 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750322 4948 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750326 4948 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750330 4948 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750334 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750338 4948 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750342 4948 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750345 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750349 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750353 4948 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750358 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750361 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750365 4948 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750369 4948 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750373 4948 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750376 4948 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750380 4948 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750384 4948 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750387 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750391 4948 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750394 4948 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750398 4948 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750401 4948 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750405 4948 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750408 4948 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750412 4948 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750415 4948 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750418 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750422 4948 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750426 4948 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750429 4948 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750433 4948 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750436 4948 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750440 4948 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750444 4948 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750448 4948 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750451 4948 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750455 4948 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750458 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750462 4948 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750465 4948 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750469 4948 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750472 4948 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750475 4948 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750479 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750483 4948 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750492 4948 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750497 4948 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750501 4948 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750505 4948 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750508 4948 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750514 4948 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750517 4948 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750522 4948 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750526 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750530 4948 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750533 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750537 4948 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.750541 4948 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.750547 4948 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.758113 4948 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.758159 4948 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758320 4948 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758334 4948 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758342 4948 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758346 4948 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758351 4948 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758355 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758361 4948 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758367 4948 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758372 4948 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758376 4948 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758381 4948 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758385 4948 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758395 4948 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758399 4948 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758403 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758407 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758411 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758415 4948 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758419 4948 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758424 4948 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758428 4948 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758431 4948 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758435 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758439 4948 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758442 4948 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758449 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758453 4948 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758457 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758461 4948 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758465 4948 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758469 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758473 4948 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758478 4948 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758483 4948 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758487 4948 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758491 4948 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758495 4948 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758503 4948 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758509 4948 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758515 4948 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758519 4948 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758524 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758528 4948 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758532 4948 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758536 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758540 4948 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758544 4948 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758548 4948 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758552 4948 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758560 4948 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758565 4948 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758570 4948 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758575 4948 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758579 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758584 4948 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758587 4948 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758592 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758595 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758599 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758603 4948 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758607 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758611 4948 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758619 4948 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758622 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758626 4948 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758631 4948 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758635 4948 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758640 4948 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758645 4948 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758649 4948 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758654 4948 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.758662 4948 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758871 4948 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758881 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758887 4948 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758891 4948 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758896 4948 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758901 4948 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758909 4948 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758913 4948 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758917 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758921 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758925 4948 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758928 4948 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758932 4948 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758936 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758940 4948 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758944 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758949 4948 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758952 4948 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758956 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758963 4948 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758968 4948 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758972 4948 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758981 4948 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758985 4948 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758989 4948 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758994 4948 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.758998 4948 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759083 4948 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759131 4948 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759141 4948 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759151 4948 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759160 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759170 4948 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759221 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759467 4948 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759481 4948 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759489 4948 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759498 4948 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759506 4948 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759515 4948 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759524 4948 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759536 4948 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759547 4948 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759556 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759565 4948 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759573 4948 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759582 4948 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759591 4948 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759601 4948 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759609 4948 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759619 4948 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759630 4948 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759641 4948 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759653 4948 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759663 4948 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759674 4948 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759685 4948 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759693 4948 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759701 4948 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759709 4948 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759719 4948 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759727 4948 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759736 4948 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759743 4948 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759751 4948 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759762 4948 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759772 4948 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759780 4948 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759788 4948 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759797 4948 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.759805 4948 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.759819 4948 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.760116 4948 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.765556 4948 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.765750 4948 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.767485 4948 server.go:997] "Starting client certificate rotation"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.767531 4948 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.767713 4948 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-29 08:57:39.477238214 +0000 UTC
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.767804 4948 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 591h31m10.709437185s for next certificate rotation
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.776594 4948 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.779449 4948 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.787911 4948 log.go:25] "Validated CRI v1 runtime API"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.805479 4948 log.go:25] "Validated CRI v1 image API"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.806827 4948 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.809950 4948 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-04-17-21-58-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.809983 4948 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.827169 4948 manager.go:217] Machine: {Timestamp:2025-12-04 17:26:28.825620246 +0000 UTC m=+0.186694678 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a01eebc5-0a1d-4af1-abcb-67984481a255 BootID:2c510825-90fc-4792-8c77-c6294ad916fc Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e3:ea:60 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e3:ea:60 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ef:3a:17 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:38:64:2c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:19:aa:93 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fa:f8:26 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:fc:7f:9a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3e:fe:67:ca:2c:81 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:e4:72:a5:1e:6e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.827468 4948 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.827824 4948 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.828818 4948 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829023 4948 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829090 4948 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829356 4948 topology_manager.go:138] "Creating topology manager with none policy"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829370 4948 container_manager_linux.go:303] "Creating device plugin manager"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829683 4948 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829727 4948 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.829963 4948 state_mem.go:36] "Initialized new in-memory state store"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.830334 4948 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.832717 4948 kubelet.go:418] "Attempting to sync node with API server"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.832751 4948 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.832816 4948 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.832834 4948 kubelet.go:324] "Adding apiserver pod source"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.832849 4948 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.838166 4948 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.838648 4948 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.839395 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.839476 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused
Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.839574 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.839602 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError"
Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.839786 4948 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 04 17:26:28
crc kubenswrapper[4948]: I1204 17:26:28.840721 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840779 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840802 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840821 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840887 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840904 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840920 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840946 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840966 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.840992 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.841026 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.841043 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.841785 4948 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.842675 4948 server.go:1280] "Started kubelet" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 
17:26:28.843227 4948 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.843495 4948 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.843484 4948 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.844108 4948 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 04 17:26:28 crc systemd[1]: Started Kubernetes Kubelet. Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.845367 4948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e132d33fedae7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 17:26:28.842609383 +0000 UTC m=+0.203683855,LastTimestamp:2025-12-04 17:26:28.842609383 +0000 UTC m=+0.203683855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.846577 4948 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.846637 4948 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 
17:26:28.846982 4948 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.847014 4948 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.847172 4948 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.846988 4948 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.847568 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.847645 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.847610 4948 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:49:58.775550621 +0000 UTC Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.847692 4948 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 953h23m29.927866573s for next certificate rotation Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.847886 4948 factory.go:55] Registering systemd factory Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.847908 4948 factory.go:221] Registration of the systemd container factory successfully Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.848542 
4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.848579 4948 factory.go:153] Registering CRI-O factory Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.848603 4948 factory.go:221] Registration of the crio container factory successfully Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.849624 4948 server.go:460] "Adding debug handlers to kubelet server" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.849743 4948 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.849813 4948 factory.go:103] Registering Raw factory Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.849857 4948 manager.go:1196] Started watching for new ooms in manager Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.852279 4948 manager.go:319] Starting recovery of all containers Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870655 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870745 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870757 4948 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870768 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870777 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870791 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870803 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870832 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870864 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870875 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870890 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870901 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870910 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870925 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870938 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870953 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870966 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870978 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870988 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.870999 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871008 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871017 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871026 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871035 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871063 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871073 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871099 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871109 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871140 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871169 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871179 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871191 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871201 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871213 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871223 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871251 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871263 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871275 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871286 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871297 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871308 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871318 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871347 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871357 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871369 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 
17:26:28.871381 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871392 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871405 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871416 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871427 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871438 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871450 4948 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871467 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871479 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871512 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871525 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871536 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.871572 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873396 4948 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873442 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873459 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873470 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873481 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873493 4948 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873504 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873516 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873527 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873537 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873548 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873559 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873570 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873581 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873594 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873605 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873616 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873627 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873643 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873655 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873666 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873676 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873686 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873696 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873709 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873719 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873736 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873747 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873756 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873769 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873780 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873795 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873807 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873816 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873828 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873840 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873851 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873863 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873873 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873884 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873895 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873904 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873914 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873930 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873941 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873951 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873960 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873978 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.873990 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874003 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874017 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874029 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874040 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874086 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874096 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874107 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874117 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874127 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874138 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874147 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" 
seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874158 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874167 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874178 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874189 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874198 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874208 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 
17:26:28.874218 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874227 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874236 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874244 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874254 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874263 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874273 4948 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874284 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874294 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874304 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874316 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874329 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874339 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874349 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874361 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874371 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874387 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874401 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874411 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874423 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874432 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874443 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874456 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874467 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874481 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874491 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874504 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874516 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874526 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874536 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874546 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" 
seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874555 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874568 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874579 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874589 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874600 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874611 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 
17:26:28.874621 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874632 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874643 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874653 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874664 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874674 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874684 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874695 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874706 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874719 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874730 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874743 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874755 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874767 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874781 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874821 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874832 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874843 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874854 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874864 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874873 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874884 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874894 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874905 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874917 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 04 
17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874927 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874941 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874952 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874963 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874975 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874986 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.874996 4948 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875006 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875017 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875029 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875044 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875072 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875086 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875102 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875117 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875132 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875145 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875156 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875168 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875180 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875190 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875201 4948 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875213 4948 reconstruct.go:97] "Volume reconstruction finished" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.875221 4948 reconciler.go:26] "Reconciler: start to sync state" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.891311 4948 manager.go:324] Recovery completed Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.906400 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.908129 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.908157 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.908168 4948 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.909445 4948 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.909632 4948 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.909690 4948 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.909742 4948 state_mem.go:36] "Initialized new in-memory state store" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.912391 4948 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.912437 4948 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 04 17:26:28 crc kubenswrapper[4948]: I1204 17:26:28.912472 4948 kubelet.go:2335] "Starting kubelet main sync loop" Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.912510 4948 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 04 17:26:28 crc kubenswrapper[4948]: W1204 17:26:28.913620 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:28 crc kubenswrapper[4948]: E1204 17:26:28.913739 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:28 crc 
kubenswrapper[4948]: E1204 17:26:28.947600 4948 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.012710 4948 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.048274 4948 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.050166 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.148791 4948 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.213313 4948 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.249276 4948 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.255370 4948 policy_none.go:49] "None policy: Start" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.257333 4948 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.257420 4948 state_mem.go:35] "Initializing new in-memory state store" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.307400 4948 manager.go:334] "Starting Device Plugin manager" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.308350 4948 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" 
err="checkpoint is not found" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.308400 4948 server.go:79] "Starting device plugin registration server" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.309249 4948 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.309351 4948 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.310773 4948 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.310934 4948 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.310954 4948 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.315923 4948 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.410602 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.412089 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.412122 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.412134 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.412161 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.412758 4948 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.451082 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.613619 4948 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.613815 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.615313 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.615345 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.615354 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.618804 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.618826 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.619216 4948 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.619305 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622471 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622518 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622540 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622483 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622543 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622574 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622563 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622657 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622593 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622786 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.622929 
4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.623799 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.623889 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.624292 4948 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.624639 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.624671 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.624687 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.624946 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.625072 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.625132 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.625442 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.625465 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.625474 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626309 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626332 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626342 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626341 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626443 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626464 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626500 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc 
kubenswrapper[4948]: I1204 17:26:29.626644 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.626692 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627558 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627594 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627611 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627673 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627699 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627716 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627776 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.627807 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.628648 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.628693 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.628710 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685736 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685781 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685806 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685827 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685853 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685877 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.685970 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686157 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686233 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686298 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686331 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686395 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686428 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686491 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.686521 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: W1204 17:26:29.692447 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.692523 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:29 crc kubenswrapper[4948]: W1204 17:26:29.787144 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:29 crc kubenswrapper[4948]: E1204 17:26:29.787269 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: 
connect: connection refused" logger="UnhandledError" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788364 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788537 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788587 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788653 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788609 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788750 4948 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788818 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788764 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788851 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788888 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788913 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788922 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788930 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788956 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788972 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788854 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788988 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.788990 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789083 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789100 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789137 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789163 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789011 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789137 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789169 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789272 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789194 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789310 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789365 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.789465 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.844746 4948 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.944667 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.950975 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.965602 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: W1204 17:26:29.980897 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ff247497bade4c5d868ae70e139029ef0bd20f367336b87d1e7d955f249033f6 WatchSource:0}: Error finding container ff247497bade4c5d868ae70e139029ef0bd20f367336b87d1e7d955f249033f6: Status 404 returned error can't find the container with id ff247497bade4c5d868ae70e139029ef0bd20f367336b87d1e7d955f249033f6 Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.981511 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: W1204 17:26:29.981878 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e9859e9ecef5cff3630896caa5a366fd79a96343ff71fcff2802d8f077d7958c WatchSource:0}: Error finding container e9859e9ecef5cff3630896caa5a366fd79a96343ff71fcff2802d8f077d7958c: Status 404 returned error can't find the container with id e9859e9ecef5cff3630896caa5a366fd79a96343ff71fcff2802d8f077d7958c Dec 04 17:26:29 crc kubenswrapper[4948]: I1204 17:26:29.988637 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 04 17:26:29 crc kubenswrapper[4948]: W1204 17:26:29.992624 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b5c90bd6fc842e94a9745f8e78db54d715b54fa7dbbd88de7653863e40df0ae4 WatchSource:0}: Error finding container b5c90bd6fc842e94a9745f8e78db54d715b54fa7dbbd88de7653863e40df0ae4: Status 404 returned error can't find the container with id b5c90bd6fc842e94a9745f8e78db54d715b54fa7dbbd88de7653863e40df0ae4 Dec 04 17:26:30 crc kubenswrapper[4948]: W1204 17:26:30.004809 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fafd5e1dd37c9db93a2615395ecfd43e6042d14e784a65ecff3e6b097ef0bda6 WatchSource:0}: Error finding container fafd5e1dd37c9db93a2615395ecfd43e6042d14e784a65ecff3e6b097ef0bda6: Status 404 returned error can't find the container with id fafd5e1dd37c9db93a2615395ecfd43e6042d14e784a65ecff3e6b097ef0bda6 Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.024357 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.025589 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.025667 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.025687 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.025726 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:30 crc 
kubenswrapper[4948]: E1204 17:26:30.026927 4948 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Dec 04 17:26:30 crc kubenswrapper[4948]: W1204 17:26:30.055164 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:30 crc kubenswrapper[4948]: E1204 17:26:30.055272 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:30 crc kubenswrapper[4948]: E1204 17:26:30.252276 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Dec 04 17:26:30 crc kubenswrapper[4948]: W1204 17:26:30.404295 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:30 crc kubenswrapper[4948]: E1204 17:26:30.404394 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection 
refused" logger="UnhandledError" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.827976 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.844735 4948 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.897767 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.897824 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.897847 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.897893 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:30 crc kubenswrapper[4948]: E1204 17:26:30.899478 4948 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.212:6443: connect: connection refused" node="crc" Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.919536 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"17abb6a7163ec64d7fef31a6c106fd33062f53de580998dac89c2d7fb9528955"} Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.921126 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b5c90bd6fc842e94a9745f8e78db54d715b54fa7dbbd88de7653863e40df0ae4"} Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.922395 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e9859e9ecef5cff3630896caa5a366fd79a96343ff71fcff2802d8f077d7958c"} Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.923741 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff247497bade4c5d868ae70e139029ef0bd20f367336b87d1e7d955f249033f6"} Dec 04 17:26:30 crc kubenswrapper[4948]: I1204 17:26:30.924760 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fafd5e1dd37c9db93a2615395ecfd43e6042d14e784a65ecff3e6b097ef0bda6"} Dec 04 17:26:31 crc kubenswrapper[4948]: W1204 17:26:31.569227 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:31 crc kubenswrapper[4948]: E1204 17:26:31.569683 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:31 crc kubenswrapper[4948]: E1204 17:26:31.746560 4948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e132d33fedae7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 17:26:28.842609383 +0000 UTC m=+0.203683855,LastTimestamp:2025-12-04 17:26:28.842609383 +0000 UTC m=+0.203683855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 17:26:31 crc kubenswrapper[4948]: I1204 17:26:31.844912 4948 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:31 crc kubenswrapper[4948]: E1204 17:26:31.854171 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.102616 4948 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2cf2c9f0e869d5c10ee6083196fe383ebb5ca5cc172de3066aabfbaad3d2e485" exitCode=0 Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.102774 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2cf2c9f0e869d5c10ee6083196fe383ebb5ca5cc172de3066aabfbaad3d2e485"} Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.102796 4948 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.105008 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.105067 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.105081 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.105758 4948 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="793022cdd8fe378a4da69e9b518a6708938378c7822de47af73c5c8a4b245b71" exitCode=0 Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.105970 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"793022cdd8fe378a4da69e9b518a6708938378c7822de47af73c5c8a4b245b71"} Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.106016 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.108535 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.108675 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.108783 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.109758 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2115fc167e63e48c9c99e3ec79ddc2b99458abb6c51f7559fcdf7f98ade49006" exitCode=0 Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.109939 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.110236 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2115fc167e63e48c9c99e3ec79ddc2b99458abb6c51f7559fcdf7f98ade49006"} Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.111663 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.111688 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.111703 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.112664 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd"} Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.115232 4948 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d" exitCode=0 Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.115273 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d"} Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.115354 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.116241 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.116279 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.116294 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.127847 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.133560 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.133618 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.133634 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:32 crc kubenswrapper[4948]: W1204 17:26:32.297737 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:32 crc kubenswrapper[4948]: E1204 17:26:32.297820 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:32 crc kubenswrapper[4948]: W1204 17:26:32.398185 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:32 crc kubenswrapper[4948]: E1204 17:26:32.398272 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.499818 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.501731 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.501797 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.501822 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.501866 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:32 crc kubenswrapper[4948]: E1204 17:26:32.502665 4948 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.212:6443: connect: connection refused" node="crc" Dec 04 17:26:32 crc kubenswrapper[4948]: I1204 17:26:32.844761 4948 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.118857 4948 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5ac40bf482bbfd8a4bd79e244223ee767155bdff94313b8e8ae1eaf6fd19730f" exitCode=0 Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.118945 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5ac40bf482bbfd8a4bd79e244223ee767155bdff94313b8e8ae1eaf6fd19730f"} Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.119167 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.120319 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.120361 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.120378 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.124929 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d10f9125026292d73d742bbf8d8115af05f8eca223ff75d32f13bdb71d5f04e5"} Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.127172 4948 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"730643e4ad9701eb8059a3452c8cf5a168b188076af4addb63cb51c55ea8e7b7"} Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.129254 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78"} Dec 04 17:26:33 crc kubenswrapper[4948]: I1204 17:26:33.131542 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b"} Dec 04 17:26:33 crc kubenswrapper[4948]: W1204 17:26:33.216268 4948 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.212:6443: connect: connection refused Dec 04 17:26:33 crc kubenswrapper[4948]: E1204 17:26:33.216393 4948 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.212:6443: connect: connection refused" logger="UnhandledError" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.138438 4948 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="701df2cab69efa224e6f3a9414a0ee2384e8bb002f2935cf87fa0a373fa1c2d6" exitCode=0 Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.138526 4948 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"701df2cab69efa224e6f3a9414a0ee2384e8bb002f2935cf87fa0a373fa1c2d6"} Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.138675 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.140029 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.140094 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.140108 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.141965 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0dbcf798728339180043bbfab93f421df35d045c10448c4064fcbfd7a481e67a"} Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.144475 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.144496 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce"} Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.145874 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.145900 4948 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:34 crc kubenswrapper[4948]: I1204 17:26:34.145912 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.160705 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.161321 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.162861 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.162922 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.162944 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.165340 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.165377 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.165393 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.171587 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"11cf03ba4b94a975d09a672219cbaa57db17d1835ac4d80bf8c38567137c014e"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.171623 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96aae72858e185bd3472edaeff3b2a000b63ac86c84667346909aeee39a98969"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.174086 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"813f784b6a6a58ec4ea9758b7f011983830719bddc81a5e20bfc9c65d885fbf2"} Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.174181 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.174971 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.175005 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.175024 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.702774 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:35 crc 
kubenswrapper[4948]: I1204 17:26:35.704458 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.704523 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.704544 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:35 crc kubenswrapper[4948]: I1204 17:26:35.704589 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.184544 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7"} Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.184729 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.186665 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.186727 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.186742 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.190824 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3fa39052a1d3ed254d9ac18cf921b9e421708958e0d7dc4884bbea41a76dc59d"} Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.190898 4948 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"668bf0321f602efe845f086f5da8ff704940dc17fd3b866532b5d77712f61351"} Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.190861 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.191023 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.191126 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.192499 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.192525 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.192596 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.192624 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.192565 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:36 crc kubenswrapper[4948]: I1204 17:26:36.192701 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.193420 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.193450 4948 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.193474 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.194780 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.194828 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.194844 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.194885 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.194911 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:37 crc kubenswrapper[4948]: I1204 17:26:37.194924 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:38 crc kubenswrapper[4948]: I1204 17:26:38.201399 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e4e5a21c26462e2b8093f3a66ef9bf0e02e66c4550fc6f399abfec49c54b39a4"} Dec 04 17:26:38 crc kubenswrapper[4948]: I1204 17:26:38.201593 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:38 crc kubenswrapper[4948]: I1204 17:26:38.202655 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:38 crc kubenswrapper[4948]: I1204 17:26:38.202704 4948 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:38 crc kubenswrapper[4948]: I1204 17:26:38.202723 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:38 crc kubenswrapper[4948]: I1204 17:26:38.547020 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.177482 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.177677 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.180011 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.180103 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.180121 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.185462 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.204529 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.204619 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.204618 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.206079 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.206112 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.206128 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.206086 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.206238 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.206269 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:39 crc kubenswrapper[4948]: E1204 17:26:39.316129 4948 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.645353 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.645618 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.647371 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.647432 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 
17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.647457 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.690488 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:39 crc kubenswrapper[4948]: I1204 17:26:39.945217 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.208319 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.208400 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.208409 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.209930 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.209975 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.209989 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.210269 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.210330 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.210349 4948 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.210480 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.210534 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.210553 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:40 crc kubenswrapper[4948]: I1204 17:26:40.780360 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.231623 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.231722 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.240022 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.241128 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.241160 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.241217 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.241240 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.241253 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:41 crc kubenswrapper[4948]: I1204 17:26:41.241275 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:42 crc kubenswrapper[4948]: I1204 17:26:42.234274 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:42 crc kubenswrapper[4948]: I1204 17:26:42.235606 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:42 crc kubenswrapper[4948]: I1204 17:26:42.235657 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:42 crc kubenswrapper[4948]: I1204 17:26:42.235671 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:43 crc kubenswrapper[4948]: I1204 17:26:43.845550 4948 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.330461 4948 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.330521 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.580034 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.580260 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.581651 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.581692 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.581704 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.951948 4948 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]log ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]etcd ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 04 17:26:44 crc kubenswrapper[4948]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/priority-and-fairness-filter ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-apiextensions-informers ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-apiextensions-controllers ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/crd-informer-synced ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-system-namespaces-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 04 17:26:44 crc kubenswrapper[4948]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 04 17:26:44 crc kubenswrapper[4948]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/bootstrap-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/start-kube-aggregator-informers ok Dec 04 17:26:44 crc kubenswrapper[4948]: 
[+]poststarthook/apiservice-status-local-available-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/apiservice-registration-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/apiservice-discovery-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]autoregister-completion ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/apiservice-openapi-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 04 17:26:44 crc kubenswrapper[4948]: livez check failed Dec 04 17:26:44 crc kubenswrapper[4948]: I1204 17:26:44.952011 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.223275 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.223609 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.225612 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.225673 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.225694 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.257176 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.257413 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.258839 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.258903 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.258920 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:45 crc kubenswrapper[4948]: I1204 17:26:45.270273 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 04 17:26:46 crc kubenswrapper[4948]: I1204 17:26:46.250334 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:46 crc kubenswrapper[4948]: I1204 17:26:46.252118 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:46 crc kubenswrapper[4948]: I1204 17:26:46.252193 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:46 crc kubenswrapper[4948]: I1204 17:26:46.252214 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:47 crc kubenswrapper[4948]: I1204 17:26:47.580724 4948 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 17:26:47 crc kubenswrapper[4948]: I1204 17:26:47.580821 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 17:26:48 crc kubenswrapper[4948]: I1204 17:26:48.911974 4948 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 04 17:26:48 crc kubenswrapper[4948]: I1204 17:26:48.912037 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 04 17:26:49 crc kubenswrapper[4948]: E1204 17:26:49.316291 4948 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 17:26:49 crc kubenswrapper[4948]: E1204 17:26:49.321846 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 04 17:26:49 crc kubenswrapper[4948]: I1204 17:26:49.330337 4948 trace.go:236] Trace[1538864543]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 17:26:38.866) (total time: 10463ms): Dec 04 17:26:49 crc kubenswrapper[4948]: Trace[1538864543]: ---"Objects listed" error: 10463ms (17:26:49.330) Dec 04 17:26:49 crc kubenswrapper[4948]: Trace[1538864543]: [10.463691361s] [10.463691361s] END Dec 04 17:26:49 crc kubenswrapper[4948]: I1204 17:26:49.330382 4948 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.175790 4948 trace.go:236] Trace[2024615724]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 17:26:36.847) (total time: 13328ms): Dec 04 17:26:50 crc kubenswrapper[4948]: Trace[2024615724]: ---"Objects listed" error: 13328ms (17:26:50.175) Dec 04 17:26:50 crc kubenswrapper[4948]: Trace[2024615724]: [13.328578377s] [13.328578377s] END Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.175858 4948 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.178393 4948 trace.go:236] Trace[1939238869]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 17:26:38.039) (total time: 12138ms): Dec 04 17:26:50 crc kubenswrapper[4948]: Trace[1939238869]: ---"Objects listed" error: 12138ms (17:26:50.178) Dec 04 17:26:50 crc kubenswrapper[4948]: Trace[1939238869]: [12.138861531s] [12.138861531s] END Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.178452 4948 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.178761 4948 apiserver.go:52] "Watching apiserver" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.179377 4948 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.179546 4948 kubelet_node_status.go:99] 
"Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.180541 4948 trace.go:236] Trace[1966936848]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 17:26:34.828) (total time: 15352ms): Dec 04 17:26:50 crc kubenswrapper[4948]: Trace[1966936848]: ---"Objects listed" error: 15351ms (17:26:50.180) Dec 04 17:26:50 crc kubenswrapper[4948]: Trace[1966936848]: [15.352071802s] [15.352071802s] END Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.180614 4948 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.188854 4948 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.189221 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.189675 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.189826 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.189960 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.190238 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.190293 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.190314 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.190368 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.190428 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.190432 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.197208 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.197872 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.198191 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.199755 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.199879 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.200535 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.209253 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.209470 4948 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.209867 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.217361 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.218156 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.248231 4948 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.262886 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280261 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280398 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280447 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280478 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280517 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280547 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280577 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280608 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280639 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280715 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280725 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280747 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280794 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280817 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280837 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280854 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280870 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280885 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280903 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280928 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280947 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280962 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280975 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.280989 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281002 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281018 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281016 4948 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281036 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281074 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281106 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281121 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281136 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281150 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281165 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281181 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281198 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281214 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 
17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281230 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281245 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281263 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281281 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281296 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281310 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281317 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281335 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281351 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281367 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281396 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281411 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281426 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281427 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281442 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281554 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281591 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281620 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281646 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281674 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281701 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281735 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281761 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281785 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281808 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 04 17:26:50 crc
kubenswrapper[4948]: I1204 17:26:50.281833 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281862 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281887 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281895 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281912 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281939 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281964 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281986 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.281988 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282010 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282033 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282073 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282096 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282121 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282145 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282171 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282195 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282218 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282243 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282274 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282297 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282319 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282343 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282367 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282391 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282415 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282437 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282459 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282480 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282500 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282523 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282595 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282615 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282634 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282655 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282710 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282735 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282763 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282784 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282814 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282836 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282856 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282877 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282895 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282916 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282965 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282989 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283011 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName:
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283032 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283069 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283092 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283110 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283137 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283162 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283186 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283209 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283233 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283257 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283282 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283306 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283326 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283348 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283374 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283399 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 04
17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283421 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283442 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283461 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283483 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283506 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283529 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283550 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283574 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283600 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283626 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283651 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283674 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283700 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283725 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283750 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283778 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283804 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282098 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282145 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282253 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282299 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282328 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282391 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282524 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282521 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282597 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282690 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282702 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282832 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282852 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.282995 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283016 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283197 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283346 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283355 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283417 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283482 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283571 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.284039 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283598 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283692 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283748 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283784 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283822 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.284805 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.284926 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.284930 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.284934 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.285177 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.285237 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.285259 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.285315 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.285556 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286112 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286416 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286517 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286577 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286704 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.283828 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286858 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286871 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286879 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). 
InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286893 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.286996 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287023 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287065 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287088 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287120 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287146 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287171 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287194 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287218 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287240 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287263 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287285 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287307 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287329 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287353 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287375 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287396 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287418 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287443 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287464 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287485 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287506 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287528 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287549 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287572 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287595 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287620 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287643 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287666 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287690 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287711 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287746 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287770 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287791 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287872 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287897 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287919 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287942 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287966 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287989 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288014 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288036 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288078 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288100 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288123 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288148 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288175 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288198 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288220 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288242 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288263 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288286 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288308 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288331 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288354 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288378 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288425 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288455 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288478 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288502 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288529 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288552 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288576 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288602 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288633 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288658 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288682 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288707 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288730 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288753 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320405 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320433 4948 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320451 4948 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320463 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320478 4948 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320489 4948 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320501 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320511 4948 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320524 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320533 4948 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320710 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320728 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320739 4948 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320751 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320762 4948 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320775 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.320785 4948 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332795 4948 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 
04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332823 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332843 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332868 4948 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332882 4948 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332896 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332908 4948 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332925 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332939 4948 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287190 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287279 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287294 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287308 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287537 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287602 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287723 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287725 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.287925 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288111 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288213 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288277 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288357 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288351 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288499 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288583 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288619 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.288733 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.323184 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.325669 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.326099 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.333751 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.326109 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.326207 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.326874 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.327318 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.327716 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.327994 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.328232 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.328460 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.328685 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.328883 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.330412 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.331169 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.331269 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.331640 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.331935 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.332719 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.333354 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334056 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334151 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334236 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334260 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334414 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334514 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.334877 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.335024 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.335229 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.335315 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.335574 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.335610 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.335801 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.336314 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.336556 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.336752 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.336769 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.336831 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.337364 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.337421 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.337742 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.337814 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.337950 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.338877 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.339009 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.339235 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.339271 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.339496 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.339564 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.339841 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340016 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340238 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340253 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340419 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340482 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.340541 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:26:50.840516265 +0000 UTC m=+22.201590667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340577 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340733 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340847 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.340873 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.341302 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.342762 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.344025 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.344209 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.344533 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.344943 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.345435 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.345871 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.346556 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.347906 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.348447 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.348653 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.348813 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.349027 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.349986 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350114 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350233 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350322 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350387 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350526 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350621 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350643 4948 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350671 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350846 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350919 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350928 4948 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351211 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.355103 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357772 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357810 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357828 4948 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357842 4948 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357857 4948 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357876 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357895 4948 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357908 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357920 4948 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357938 4948 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357952 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357969 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node 
\"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357983 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358004 4948 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358019 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358032 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358073 4948 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358088 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358103 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc 
kubenswrapper[4948]: I1204 17:26:50.358117 4948 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358136 4948 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358153 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.350991 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351109 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351192 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351377 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351413 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351502 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351530 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351628 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351671 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351757 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351760 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351860 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.351935 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352017 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352089 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352162 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352246 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352442 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352469 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352840 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.352850 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.353105 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.353348 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.354562 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.354765 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.354821 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.354899 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.354969 4948 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.358613 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:50.858590603 +0000 UTC m=+22.219665015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.355126 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.355392 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.355695 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.356177 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.356539 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357212 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357221 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357580 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.357608 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.357679 4948 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.358765 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-04 17:26:50.858753357 +0000 UTC m=+22.219827759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.358127 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.359144 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.369011 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.377066 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.378222 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.381764 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.381798 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.381824 4948 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.381918 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-04 17:26:50.881891286 +0000 UTC m=+22.242965688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.384314 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.388659 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.391620 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.391676 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.391703 4948 projected.go:194] Error 
preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.391800 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:50.891766472 +0000 UTC m=+22.252840874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.394626 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.400532 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.409951 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.412979 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.428259 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459022 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459104 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459167 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459179 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459189 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459199 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459212 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459221 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459234 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459260 4948 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459276 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459331 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459344 4948 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459359 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459369 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459381 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459391 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459264 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459453 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459534 4948 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459552 4948 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459564 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459575 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50
crc kubenswrapper[4948]: I1204 17:26:50.459609 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459626 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459656 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459667 4948 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459682 4948 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459693 4948 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459702 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459715 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459724 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459734 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459743 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459766 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459776 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459785 4948 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459794 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459807 4948 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459817 4948 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459827 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459840 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459850 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459860 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459870 4948 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName:
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459883 4948 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459892 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459902 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459913 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459925 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459933 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459944 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459954 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459970 4948 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459982 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.459994 4948 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460009 4948 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460021 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460032 4948 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460060 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460077 4948 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460091 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460103 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460116 4948 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460134 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460147 4948 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460160 4948 reconciler_common.go:293]
"Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460176 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460191 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460204 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460219 4948 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460236 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460265 4948 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460296 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460310 4948 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460327 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460340 4948 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460352 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460366 4948 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460385 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460399 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460412 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460431 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460442 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460455 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460467 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460486 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460499 4948 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460510 4948 reconciler_common.go:293] "Volume detached for volume
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460523 4948 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460540 4948 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460552 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460565 4948 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460578 4948 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460594 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460608 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460621 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460637 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460649 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460661 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460672 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460688 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460703 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460717 4948 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460730 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460745 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460756 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460768 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460783 4948 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460795 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 04
17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460807 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460820 4948 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460835 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460846 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460858 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460871 4948 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460886 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460900 4948 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460912 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460924 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460941 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460953 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460964 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460984 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.460997 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" 
(UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461010 4948 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461022 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461055 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461070 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461082 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461095 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461112 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") 
on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461126 4948 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461139 4948 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461154 4948 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461165 4948 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461176 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461188 4948 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461203 4948 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461217 4948 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461229 4948 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461242 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461259 4948 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461271 4948 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.461284 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.527545 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.535196 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.561836 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.838854 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.840305 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.847902 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.848012 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.864401 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.864533 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.864614 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:26:51.864572063 +0000 UTC m=+23.225646495 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.864646 4948 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.864701 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:51.864686436 +0000 UTC m=+23.225760838 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.864736 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.864775 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.864850 4948 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.864882 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:51.864875951 +0000 UTC m=+23.225950343 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.884228 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.887697 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.888570 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.929809 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.930467 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.932120 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.932882 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.934116 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.937523 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.938504 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.940192 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.948540 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.965346 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.965399 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.965424 4948 reconciler_common.go:293] "Volume detached 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: I1204 17:26:50.965435 4948 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965537 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965550 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965561 4948 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965570 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965604 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:51.965591538 +0000 UTC m=+23.326665930 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965607 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965631 4948 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:50 crc kubenswrapper[4948]: E1204 17:26:50.965696 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:51.96567361 +0000 UTC m=+23.326748012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.138437 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.293005 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.294636 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.297014 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.298812 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.299919 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.301928 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.303193 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.305216 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.306504 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.307390 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.309407 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.310788 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.311705 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.314444 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.315433 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.316297 4948 kubelet.go:1929] "Failed creating a mirror pod for" err="pods 
\"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.317297 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.318223 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.320588 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.320807 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.322579 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.323581 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.325857 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.326937 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.332034 4948 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.332198 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.334570 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.335935 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.336511 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.338687 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.340190 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.340906 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.342346 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.343311 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.344529 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.344530 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.345377 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.346771 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.347675 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.348976 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.349736 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.350957 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.352184 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.353692 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.354565 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.355883 4948 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.355941 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.356771 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.357730 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.359197 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 04 17:26:51 crc 
kubenswrapper[4948]: I1204 17:26:51.359976 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"023bbbe90e204eec8282895cd79d003d7ceb789bfc64b6e2d245107a38b88212"} Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.360083 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6306ae854e267f18628adbadcf6b2c8decd403e0f13ddacadbcc7c9907eba986"} Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.360120 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.372693 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.393936 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.408252 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.419657 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.880127 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.880267 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.880297 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.880417 4948 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.880488 4948 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.880496 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:26:53.880448243 +0000 UTC m=+25.241522656 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.880632 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:53.880605528 +0000 UTC m=+25.241680110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.880661 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-04 17:26:53.880652069 +0000 UTC m=+25.241726701 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.913345 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.913586 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.913377 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.913351 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.913738 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.913965 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.981391 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:51 crc kubenswrapper[4948]: I1204 17:26:51.981469 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.981678 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.981735 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 
17:26:51.981755 4948 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.981831 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:53.981804387 +0000 UTC m=+25.342878819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.981681 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.981879 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:51 crc kubenswrapper[4948]: E1204 17:26:51.981895 4948 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:51 crc 
kubenswrapper[4948]: E1204 17:26:51.981939 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:53.98192641 +0000 UTC m=+25.343000842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.268709 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e3ddcbf833a8be3e81d7136c524ada5a10162b07d755f6504b9a42689ee2fee2"} Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.866400 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hfvn4"] Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.871460 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ql2z6"] Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.871981 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.872162 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.872029 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2gnsr"] Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.875145 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lz7z7"] Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.875512 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.876350 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.882469 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.882528 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.882779 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.882854 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.883134 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.883194 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.883319 4948 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.883368 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.883428 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.889514 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.899956 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.901210 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.902188 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.906581 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.906812 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.916899 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4cnmm"] Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.917937 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.921020 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.922209 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.922658 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.922866 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.922986 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.924752 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.924754 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.925832 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.945559 
4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.974946 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990310 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990349 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-cni-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990367 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-rootfs\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990382 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990400 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-systemd-units\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990415 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-cnibin\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990430 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-netns\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990532 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-kubelet\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990594 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-multus-certs\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990636 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-script-lib\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990667 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mph6j\" (UniqueName: \"kubernetes.io/projected/8149892b-eb94-4d2d-99b3-cebf34efa32a-kube-api-access-mph6j\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990742 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-os-release\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990794 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-proxy-tls\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990843 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-netns\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990885 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-socket-dir-parent\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990918 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-cni-multus\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.990937 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4014d19-9310-4326-81ae-dd5d03df6311-hosts-file\") pod \"node-resolver-ql2z6\" (UID: \"b4014d19-9310-4326-81ae-dd5d03df6311\") " pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991014 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-node-log\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991062 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991089 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovn-node-metrics-cert\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991105 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-hostroot\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991123 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cnibin\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 
17:26:52.991139 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-os-release\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991159 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-etc-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991177 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-k8s-cni-cncf-io\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991195 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rhh5\" (UniqueName: \"kubernetes.io/projected/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-kube-api-access-2rhh5\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991214 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-bin\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991233 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpt4n\" (UniqueName: \"kubernetes.io/projected/cda64a2b-9444-49d3-bee6-21e8c2bae502-kube-api-access-xpt4n\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991252 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bslgt\" (UniqueName: \"kubernetes.io/projected/b4014d19-9310-4326-81ae-dd5d03df6311-kube-api-access-bslgt\") pod \"node-resolver-ql2z6\" (UID: \"b4014d19-9310-4326-81ae-dd5d03df6311\") " pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991316 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-log-socket\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991352 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-config\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991380 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-env-overrides\") pod \"ovnkube-node-4cnmm\" (UID: 
\"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991414 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xzb\" (UniqueName: \"kubernetes.io/projected/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-kube-api-access-86xzb\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991441 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-systemd\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991467 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-var-lib-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991489 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991515 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991535 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-slash\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991556 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cda64a2b-9444-49d3-bee6-21e8c2bae502-cni-binary-copy\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991575 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-etc-kubernetes\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991598 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-ovn\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991620 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-kubelet\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991642 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-system-cni-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991667 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-conf-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991688 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-daemon-config\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991735 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-system-cni-dir\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991764 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991826 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-netd\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991850 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-cni-bin\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:52 crc kubenswrapper[4948]: I1204 17:26:52.991872 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-mcd-auth-proxy-config\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.002674 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.021695 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.035298 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.043447 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.052814 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.062296 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.081290 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093156 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-log-socket\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093219 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-config\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093238 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-env-overrides\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093263 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xzb\" (UniqueName: \"kubernetes.io/projected/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-kube-api-access-86xzb\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093280 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-systemd\") pod 
\"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093295 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-var-lib-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093311 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093309 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-log-socket\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093336 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093392 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-var-lib-openvswitch\") pod 
\"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093402 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-slash\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093451 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093453 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cda64a2b-9444-49d3-bee6-21e8c2bae502-cni-binary-copy\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093477 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-etc-kubernetes\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093496 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-ovn\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093513 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-system-cni-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093529 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-conf-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093523 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-systemd\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093546 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-daemon-config\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093629 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-kubelet\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093660 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-system-cni-dir\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093686 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093709 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-cni-bin\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093728 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-mcd-auth-proxy-config\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093749 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-netd\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093773 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093794 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-cni-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093813 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-rootfs\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093834 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-systemd-units\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094283 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094363 4948 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-ovn\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094362 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-cnibin\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093853 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-cnibin\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094406 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-etc-kubernetes\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094389 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-env-overrides\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094518 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-system-cni-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " 
pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094547 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-daemon-config\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094605 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-conf-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094647 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-system-cni-dir\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094705 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-systemd-units\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.093424 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-slash\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094720 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-rootfs\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094750 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-cni-dir\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094754 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-kubelet\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094660 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094788 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-cni-bin\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094803 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-netd\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094910 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-mcd-auth-proxy-config\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094953 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-netns\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.094964 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-netns\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095017 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-kubelet\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095078 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-multus-certs\") pod 
\"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095059 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-config\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095107 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-kubelet\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095129 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095133 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-multus-certs\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095167 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-script-lib\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095229 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mph6j\" (UniqueName: \"kubernetes.io/projected/8149892b-eb94-4d2d-99b3-cebf34efa32a-kube-api-access-mph6j\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095267 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-os-release\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095289 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-proxy-tls\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095292 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095314 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-netns\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" 
Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095369 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-socket-dir-parent\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095409 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-cni-multus\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095444 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4014d19-9310-4326-81ae-dd5d03df6311-hosts-file\") pod \"node-resolver-ql2z6\" (UID: \"b4014d19-9310-4326-81ae-dd5d03df6311\") " pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095472 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-node-log\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095490 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095514 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovn-node-metrics-cert\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095536 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-hostroot\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095562 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cnibin\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095584 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-os-release\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095604 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-etc-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095627 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-k8s-cni-cncf-io\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095652 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rhh5\" (UniqueName: \"kubernetes.io/projected/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-kube-api-access-2rhh5\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095679 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpt4n\" (UniqueName: \"kubernetes.io/projected/cda64a2b-9444-49d3-bee6-21e8c2bae502-kube-api-access-xpt4n\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095705 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bslgt\" (UniqueName: \"kubernetes.io/projected/b4014d19-9310-4326-81ae-dd5d03df6311-kube-api-access-bslgt\") pod \"node-resolver-ql2z6\" (UID: \"b4014d19-9310-4326-81ae-dd5d03df6311\") " pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095734 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-bin\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095828 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-bin\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095865 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-ovn-kubernetes\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096220 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-script-lib\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096315 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-run-k8s-cni-cncf-io\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096326 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-cnibin\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.095652 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-node-log\") pod 
\"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096386 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-etc-openvswitch\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096396 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-os-release\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096447 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-hostroot\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096614 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-multus-socket-dir-parent\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096658 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-netns\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 
17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096671 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-host-var-lib-cni-multus\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096696 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b4014d19-9310-4326-81ae-dd5d03df6311-hosts-file\") pod \"node-resolver-ql2z6\" (UID: \"b4014d19-9310-4326-81ae-dd5d03df6311\") " pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.096810 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cda64a2b-9444-49d3-bee6-21e8c2bae502-os-release\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.098490 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cda64a2b-9444-49d3-bee6-21e8c2bae502-cni-binary-copy\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.101533 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.103078 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovn-node-metrics-cert\") pod \"ovnkube-node-4cnmm\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.104195 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-proxy-tls\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.112252 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.118671 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mph6j\" (UniqueName: \"kubernetes.io/projected/8149892b-eb94-4d2d-99b3-cebf34efa32a-kube-api-access-mph6j\") pod \"ovnkube-node-4cnmm\" (UID: 
\"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.119099 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rhh5\" (UniqueName: \"kubernetes.io/projected/9c5bb3e4-2f5a-47d7-a998-be50d1429cb2-kube-api-access-2rhh5\") pod \"machine-config-daemon-hfvn4\" (UID: \"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\") " pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.120836 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpt4n\" (UniqueName: \"kubernetes.io/projected/cda64a2b-9444-49d3-bee6-21e8c2bae502-kube-api-access-xpt4n\") pod \"multus-lz7z7\" (UID: \"cda64a2b-9444-49d3-bee6-21e8c2bae502\") " pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.121869 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bslgt\" (UniqueName: \"kubernetes.io/projected/b4014d19-9310-4326-81ae-dd5d03df6311-kube-api-access-bslgt\") pod \"node-resolver-ql2z6\" (UID: \"b4014d19-9310-4326-81ae-dd5d03df6311\") " pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.123569 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.128081 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xzb\" (UniqueName: \"kubernetes.io/projected/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-kube-api-access-86xzb\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.131270 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.143677 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.158993 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.174518 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.186762 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.191853 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.197267 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.200459 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ql2z6" Dec 04 17:26:53 crc kubenswrapper[4948]: W1204 17:26:53.203224 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5bb3e4_2f5a_47d7_a998_be50d1429cb2.slice/crio-ded932b523e371dca262da4c9264ea48779ccef3842f4ecea03bca71c8306777 WatchSource:0}: Error finding container ded932b523e371dca262da4c9264ea48779ccef3842f4ecea03bca71c8306777: Status 404 returned error can't find the container with id ded932b523e371dca262da4c9264ea48779ccef3842f4ecea03bca71c8306777 Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.208484 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.210611 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4170d85e-dba9-4cc0-8183-2b16aa4f43e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2gnsr\" (UID: \"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\") " pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.212082 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lz7z7" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.224466 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.225604 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.238384 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.285923 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"e48d84f8b3d426f09f1bab699d96da730be760e50d79a9cd9d503beb7952d1d7"} Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.289595 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ql2z6" event={"ID":"b4014d19-9310-4326-81ae-dd5d03df6311","Type":"ContainerStarted","Data":"0cbef291ab2b7b43615d6179d84723242f5cc0c0cf681fedaea82ad3483f3b73"} Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.290102 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lz7z7" event={"ID":"cda64a2b-9444-49d3-bee6-21e8c2bae502","Type":"ContainerStarted","Data":"4ee021e3ff97c9b64a4bd6223b4c36ed0abe871e895284f60878ccb76a3af6fd"} Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.292155 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"ded932b523e371dca262da4c9264ea48779ccef3842f4ecea03bca71c8306777"} Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.903626 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.903804 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.903884 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.904128 4948 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.904157 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:26:57.904119965 +0000 UTC m=+29.265194397 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.904283 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:57.904258668 +0000 UTC m=+29.265333100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.904282 4948 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.904487 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:57.904459494 +0000 UTC m=+29.265533906 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.914322 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.914573 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.914731 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:26:53 crc kubenswrapper[4948]: I1204 17:26:53.914496 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.914977 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:26:53 crc kubenswrapper[4948]: E1204 17:26:53.915091 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.004954 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.004998 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005138 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005152 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005162 4948 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005164 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005192 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005223 4948 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005204 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:58.005191562 +0000 UTC m=+29.366265964 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:54 crc kubenswrapper[4948]: E1204 17:26:54.005296 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 17:26:58.005277344 +0000 UTC m=+29.366351766 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.294651 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33"} Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.296976 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"af1fde01f3f8f1cb71786aab9558135bf1074bbdc85e280dd3028e0b2c7d5350"} Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.298816 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerStarted","Data":"670c85d5f56ce1a8e5529689f6f2b8c1a4afc3856cb0b7ba054b810d5d5402c1"} Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.631501 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.635789 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.641024 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.643900 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.654971 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.671835 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.687450 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.700221 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.712722 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.722315 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.732795 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.742280 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.753468 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.763534 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.775685 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.785420 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.794601 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.801701 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.812851 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.828371 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.839943 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.848331 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.862907 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.873921 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.884995 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.895136 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.903458 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:54 crc kubenswrapper[4948]: I1204 17:26:54.912257 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.303570 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lz7z7" event={"ID":"cda64a2b-9444-49d3-bee6-21e8c2bae502","Type":"ContainerStarted","Data":"43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7"} Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.305289 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"8e811b8000b0a1451742559953ae4b8ceaef08af55bb4663a9967a43362e5d3b"} Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.306698 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} Dec 04 
17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.321312 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.336570 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.349087 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.363253 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.394168 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.409086 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resou
rce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.423915 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.433765 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.444523 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.454316 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.463694 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.476710 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b
6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.486515 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.913152 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.913275 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:55 crc kubenswrapper[4948]: I1204 17:26:55.913317 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:55 crc kubenswrapper[4948]: E1204 17:26:55.914205 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:26:55 crc kubenswrapper[4948]: E1204 17:26:55.914344 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:26:55 crc kubenswrapper[4948]: E1204 17:26:55.914560 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.148095 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-q7rfn"] Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.148732 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.150620 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.150834 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.151203 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.151247 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.160356 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q7rfn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da8fe0b9-424c-4361-b63a-5631e21f8fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hc52g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q7rfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.173986 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.185245 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.196100 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.205355 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.214953 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.226227 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da8fe0b9-424c-4361-b63a-5631e21f8fb4-serviceca\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.226271 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da8fe0b9-424c-4361-b63a-5631e21f8fb4-host\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.226311 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc52g\" (UniqueName: \"kubernetes.io/projected/da8fe0b9-424c-4361-b63a-5631e21f8fb4-kube-api-access-hc52g\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.230668 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.242004 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\
\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.252892 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.263267 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.278936 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.290909 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.304939 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.310528 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ql2z6" event={"ID":"b4014d19-9310-4326-81ae-dd5d03df6311","Type":"ContainerStarted","Data":"8ba6c4514cc1e008ac36c8b64e24f3ab24d07e8152e5bf10a30702d74b4a2d71"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.311798 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerStarted","Data":"00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.313544 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6eaa6d228a485c6a98bde5ff43ee9e40255b060989a4be1f4c310355bc74607f"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.315303 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" exitCode=0 Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.315369 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" 
event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.316685 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5
d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.326989 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da8fe0b9-424c-4361-b63a-5631e21f8fb4-serviceca\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.327033 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da8fe0b9-424c-4361-b63a-5631e21f8fb4-host\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.327101 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hc52g\" (UniqueName: \"kubernetes.io/projected/da8fe0b9-424c-4361-b63a-5631e21f8fb4-kube-api-access-hc52g\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.327276 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da8fe0b9-424c-4361-b63a-5631e21f8fb4-host\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.327374 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.328593 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/da8fe0b9-424c-4361-b63a-5631e21f8fb4-serviceca\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.335699 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.357090 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc52g\" (UniqueName: \"kubernetes.io/projected/da8fe0b9-424c-4361-b63a-5631e21f8fb4-kube-api-access-hc52g\") pod \"node-ca-q7rfn\" (UID: \"da8fe0b9-424c-4361-b63a-5631e21f8fb4\") " pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.386210 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.412606 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.434322 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\
\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.446142 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.454517 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.464244 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q7rfn" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.467085 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.479962 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.488079 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.494899 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.504647 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.513737 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.520325 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q7rfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da8fe0b9-424c-4361-b63a-5631e21f8fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hc52g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q7rfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.580521 4948 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.582902 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: 
I1204 17:26:56.582938 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.582950 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.583085 4948 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.589893 4948 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.590161 4948 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.591427 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.591481 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.591492 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.591512 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.591523 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: E1204 17:26:56.605816 4948 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2c510825-90fc-4792-8c77-c6294ad916fc\\\",\\\"systemUUID\\\":\\\"a01eebc5-0a1d-4af1-abcb-67984481a255\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:56Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.609253 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.609289 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.609299 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.609321 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.609330 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: E1204 17:26:56.620411 4948 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2c510825-90fc-4792-8c77-c6294ad916fc\\\",\\\"systemUUID\\\":\\\"a01eebc5-0a1d-4af1-abcb-67984481a255\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:56Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.623451 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.623482 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.623491 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.623505 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.623515 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: E1204 17:26:56.634217 4948 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2c510825-90fc-4792-8c77-c6294ad916fc\\\",\\\"systemUUID\\\":\\\"a01eebc5-0a1d-4af1-abcb-67984481a255\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:56Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.637092 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.637127 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.637137 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.637153 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.637164 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: E1204 17:26:56.648442 4948 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the 17:26:56.634217 entry above ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:56Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.651518 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.651557 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.651566 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.651581 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.651592 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: E1204 17:26:56.672467 4948 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2c510825-90fc-4792-8c77-c6294ad916fc\\\",\\\"systemUUID\\\":\\\"a01eebc5-0a1d-4af1-abcb-67984481a255\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:56Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:56 crc kubenswrapper[4948]: E1204 17:26:56.672583 4948 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.675626 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.675657 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.675670 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.675687 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.675700 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.779305 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.779360 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.779374 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.779397 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.779414 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.885664 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.885774 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.885807 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.885837 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.885858 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.988692 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.988731 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.988743 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.988758 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:56 crc kubenswrapper[4948]: I1204 17:26:56.988769 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:56Z","lastTransitionTime":"2025-12-04T17:26:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.091529 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.092166 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.092269 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.092399 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.092486 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.196335 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.196456 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.196476 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.196502 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.196520 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.300546 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.300634 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.300652 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.300677 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.300695 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.320652 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q7rfn" event={"ID":"da8fe0b9-424c-4361-b63a-5631e21f8fb4","Type":"ContainerStarted","Data":"02b00a4ba174bfcbcf7ccb3a97b186d6265c2df83f9ffd0643c1e9e30777a151"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.338571 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.355878 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.385467 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.402467 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.403685 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.403725 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.403742 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.403763 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 
04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.403777 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.423996 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.442673 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.465308 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.489337 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.506885 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.506957 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.506980 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.507014 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.507037 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.509765 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.531395 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.548641 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.568166 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.586906 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.605156 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q7rfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da8fe0b9-424c-4361-b63a-5631e21f8fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hc52g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q7rfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:57Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.610019 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.610106 4948 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.610123 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.610146 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.610165 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.712951 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.713012 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.713030 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.713118 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.713138 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.816595 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.816654 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.816689 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.816719 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.816740 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.912721 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.912721 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.912872 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.913114 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.913266 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.913397 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.924896 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.924969 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.924992 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.925020 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.925083 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:57Z","lastTransitionTime":"2025-12-04T17:26:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.948797 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.948942 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.949026 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:05.948982554 +0000 UTC m=+37.310056996 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.949136 4948 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:57 crc kubenswrapper[4948]: I1204 17:26:57.949161 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.949224 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:05.94920007 +0000 UTC m=+37.310274512 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.949271 4948 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:57 crc kubenswrapper[4948]: E1204 17:26:57.949356 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:05.949332933 +0000 UTC m=+37.310407365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.031515 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.031628 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.031653 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.031686 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc 
kubenswrapper[4948]: I1204 17:26:58.031709 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.051197 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.051297 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051557 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051608 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051633 4948 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051720 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:06.051692003 +0000 UTC m=+37.412766455 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051854 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051884 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.051954 4948 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:58 crc kubenswrapper[4948]: E1204 17:26:58.052016 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:06.051995451 +0000 UTC m=+37.413069883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.134634 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.134664 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.134674 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.134689 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.134701 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.240158 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.240826 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.240848 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.240878 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.240894 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.327663 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"99725401c4b1592e05e50f2ad839428c57a29e5960b5d0c25e94b6619022434e"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.329345 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q7rfn" event={"ID":"da8fe0b9-424c-4361-b63a-5631e21f8fb4","Type":"ContainerStarted","Data":"c5fd9cbb3b32c06c1bfc0ed2d92907953ec0c6c3ef036edc50a8969c6d6585de"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.333912 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"ca2490da3a1d784febcc6515efd89108b485a746a359ece0585df00723f355b6"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.336318 4948 generic.go:334] "Generic (PLEG): container finished" podID="4170d85e-dba9-4cc0-8183-2b16aa4f43e7" containerID="00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441" exitCode=0 Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.336739 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerDied","Data":"00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.343495 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.348471 4948 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.348514 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.348526 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.348544 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.348558 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.375579 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.401583 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.417351 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.431595 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.444744 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.452885 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.452943 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.452955 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc 
kubenswrapper[4948]: I1204 17:26:58.452975 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.452989 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.463441 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.484297 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.495156 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q7rfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da8fe0b9-424c-4361-b63a-5631e21f8fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hc52g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q7rfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.507089 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.521260 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.535496 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.556524 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc 
kubenswrapper[4948]: I1204 17:26:58.556554 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.556589 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.556604 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.556615 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.557398 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.573693 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.585361 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.603173 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.619620 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q7rfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da8fe0b9-424c-4361-b63a-5631e21f8fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hc52g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q7rfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.635767 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.651748 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba6c4514cc1e008ac36c8b64e24f3ab24d07e8152e5bf10a30702d74b4a2d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.659080 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.659111 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.659121 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.659136 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.659146 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.669261 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z 
is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.684072 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.696082 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.707836 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eaa6d228a485c6a98bde5ff43ee9e40255b060989a4be1f4c310355bc74607f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.717772 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.731583 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.745383 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.757433 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.760960 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.760984 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.760993 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.761007 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.761017 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.769174 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.788133 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.864348 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.864392 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc 
kubenswrapper[4948]: I1204 17:26:58.864403 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.864419 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.864431 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.932461 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.948013 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.967018 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.967094 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.967107 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.967124 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.967136 4948 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:58Z","lastTransitionTime":"2025-12-04T17:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.969610 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:58 crc kubenswrapper[4948]: I1204 17:26:58.985635 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86166768-c599-43d8-82b3-3f8752ade673\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64223a9148a1c8f79827ae1ce87e2d89590c5f3e05814ae1e902998f57a4df78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0b243fdac67da80a5cd294da037f50dec0c809633227b5ba4b18a4a0efa7dce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc451bf6aff1bef126491693c95ad96b8d64e6c629587e4dce627cc11fb6ab5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.001854 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:58Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.019556 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4170d85e-dba9-4cc0-8183-2b16aa4f43e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00056702cac45afc73b309d3c8ab3676f1a3d617a9208b6b941c48474609a441\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86xzb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2gnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.035646 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da81744a923b3e35e7b7416c7997a78d504b43adaf9cee467ddcceda8cfe6b33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.045918 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q7rfn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da8fe0b9-424c-4361-b63a-5631e21f8fb4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hc52g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q7rfn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.063281 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.069014 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.069036 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.069066 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.069079 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.069089 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.075891 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.087451 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eaa6d228a485c6a98bde5ff43ee9e40255b060989a4be1f4c310355bc74607f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.098414 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.107128 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba6c4514cc1e008ac36c8b64e24f3ab24d07e8152e5bf10a30702d74b4a2d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.119747 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.171986 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc 
kubenswrapper[4948]: I1204 17:26:59.172396 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.172546 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.172702 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.172839 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.275332 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.275369 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.275381 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.275397 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.275407 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.347429 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerStarted","Data":"aa14a2c3fb8a5aeb061ae77d3039e1cfe4963dab814f4f8687b99efa090a3dc2"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.349620 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.365585 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ql2z6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4014d19-9310-4326-81ae-dd5d03df6311\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba6c4514cc1e008ac36c8b64e24f3ab24d07e8152e5bf10a30702d74b4a2d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bslgt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ql2z6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.377391 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.377432 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.377453 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.377469 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.377481 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.377559 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lz7z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cda64a2b-9444-49d3-bee6-21e8c2bae502\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xpt4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lz7z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z 
is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.400370 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8149892b-eb94-4d2d-99b3-cebf34efa32a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mph6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4cnmm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.417880 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5df3161c-11e8-460d-9c77-68d23acc9609\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T17:26:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T17:26:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.432033 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eaa6d228a485c6a98bde5ff43ee9e40255b060989a4be1f4c310355bc74607f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.446017 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:52Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rhh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T17:26:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hfvn4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.459115 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:59Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99725401c4b1592e05e50f2ad839428c57a29e5960b5d0c25e94b6619022434e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af1fde01f3f8f1cb71786aab9558135bf1074bbdc85e280dd3028e0b2c7d5350\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T17:26:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.470762 4948 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T17:26:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T17:26:59Z is after 2025-08-24T17:21:41Z" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.482006 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.482060 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.482073 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.482089 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.482099 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.575859 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=5.575826912 podStartE2EDuration="5.575826912s" podCreationTimestamp="2025-12-04 17:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:26:59.575670027 +0000 UTC m=+30.936744429" watchObservedRunningTime="2025-12-04 17:26:59.575826912 +0000 UTC m=+30.936901354" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.585269 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.585322 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.585339 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.585362 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.585382 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.654630 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.654607651 podStartE2EDuration="9.654607651s" podCreationTimestamp="2025-12-04 17:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:26:59.653557644 +0000 UTC m=+31.014632046" watchObservedRunningTime="2025-12-04 17:26:59.654607651 +0000 UTC m=+31.015682053" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.689305 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podStartSLOduration=7.689286239 podStartE2EDuration="7.689286239s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:26:59.688443127 +0000 UTC m=+31.049517529" watchObservedRunningTime="2025-12-04 17:26:59.689286239 +0000 UTC m=+31.050360641" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.691051 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.691096 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.691104 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.691118 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.691128 4948 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.699776 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ql2z6" podStartSLOduration=8.69976126 podStartE2EDuration="8.69976126s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:26:59.699320179 +0000 UTC m=+31.060394601" watchObservedRunningTime="2025-12-04 17:26:59.69976126 +0000 UTC m=+31.060835662" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.715338 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lz7z7" podStartSLOduration=7.715317713 podStartE2EDuration="7.715317713s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:26:59.714220385 +0000 UTC m=+31.075294797" watchObservedRunningTime="2025-12-04 17:26:59.715317713 +0000 UTC m=+31.076392115" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.722225 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln"] Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.722680 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.724226 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.724260 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.764832 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-t6lr5"] Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.765241 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:26:59 crc kubenswrapper[4948]: E1204 17:26:59.765292 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.768888 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24089a47-fd9e-491e-9287-792b784e3752-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.768912 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24089a47-fd9e-491e-9287-792b784e3752-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.768944 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24089a47-fd9e-491e-9287-792b784e3752-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.768974 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7tz\" (UniqueName: \"kubernetes.io/projected/24089a47-fd9e-491e-9287-792b784e3752-kube-api-access-pq7tz\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.786733 4948 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q7rfn" podStartSLOduration=8.786719082 podStartE2EDuration="8.786719082s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:26:59.779760861 +0000 UTC m=+31.140835263" watchObservedRunningTime="2025-12-04 17:26:59.786719082 +0000 UTC m=+31.147793484" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.793687 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.793876 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.793964 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.794075 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.794180 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.870308 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdwzn\" (UniqueName: \"kubernetes.io/projected/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-kube-api-access-bdwzn\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.870364 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7tz\" (UniqueName: \"kubernetes.io/projected/24089a47-fd9e-491e-9287-792b784e3752-kube-api-access-pq7tz\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.870399 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24089a47-fd9e-491e-9287-792b784e3752-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.870418 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24089a47-fd9e-491e-9287-792b784e3752-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.870441 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/24089a47-fd9e-491e-9287-792b784e3752-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.870459 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.871246 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24089a47-fd9e-491e-9287-792b784e3752-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.871271 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24089a47-fd9e-491e-9287-792b784e3752-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.880809 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24089a47-fd9e-491e-9287-792b784e3752-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.885645 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7tz\" (UniqueName: \"kubernetes.io/projected/24089a47-fd9e-491e-9287-792b784e3752-kube-api-access-pq7tz\") pod \"ovnkube-control-plane-749d76644c-r2gln\" (UID: \"24089a47-fd9e-491e-9287-792b784e3752\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.896874 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.896906 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.896917 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.896932 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.896943 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:26:59Z","lastTransitionTime":"2025-12-04T17:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.913468 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.913484 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.913488 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:26:59 crc kubenswrapper[4948]: E1204 17:26:59.913609 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:26:59 crc kubenswrapper[4948]: E1204 17:26:59.913790 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:26:59 crc kubenswrapper[4948]: E1204 17:26:59.913941 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.971087 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdwzn\" (UniqueName: \"kubernetes.io/projected/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-kube-api-access-bdwzn\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:26:59 crc kubenswrapper[4948]: I1204 17:26:59.971185 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:26:59 crc kubenswrapper[4948]: E1204 17:26:59.971297 4948 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:26:59 crc kubenswrapper[4948]: E1204 17:26:59.971355 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs podName:f47382b4-4f12-471b-92aa-5d4ccb9c0bf0 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:00.471340431 +0000 UTC m=+31.832414833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs") pod "network-metrics-daemon-t6lr5" (UID: "f47382b4-4f12-471b-92aa-5d4ccb9c0bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.000434 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.000491 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.000508 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.000531 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.000549 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.001248 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdwzn\" (UniqueName: \"kubernetes.io/projected/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-kube-api-access-bdwzn\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.039955 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" Dec 04 17:27:00 crc kubenswrapper[4948]: W1204 17:27:00.060020 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24089a47_fd9e_491e_9287_792b784e3752.slice/crio-6f8cc69fa31a395f45cb1f6e4a7e4936589b6ce2ada93248df915f1ae59480f9 WatchSource:0}: Error finding container 6f8cc69fa31a395f45cb1f6e4a7e4936589b6ce2ada93248df915f1ae59480f9: Status 404 returned error can't find the container with id 6f8cc69fa31a395f45cb1f6e4a7e4936589b6ce2ada93248df915f1ae59480f9 Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.103802 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.103858 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.103877 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.103902 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.103921 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.206528 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.206918 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.206932 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.206954 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.206966 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.308813 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.308851 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.308859 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.308873 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.308882 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.354853 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" event={"ID":"24089a47-fd9e-491e-9287-792b784e3752","Type":"ContainerStarted","Data":"6f8cc69fa31a395f45cb1f6e4a7e4936589b6ce2ada93248df915f1ae59480f9"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.357650 4948 generic.go:334] "Generic (PLEG): container finished" podID="4170d85e-dba9-4cc0-8183-2b16aa4f43e7" containerID="aa14a2c3fb8a5aeb061ae77d3039e1cfe4963dab814f4f8687b99efa090a3dc2" exitCode=0 Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.357716 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerDied","Data":"aa14a2c3fb8a5aeb061ae77d3039e1cfe4963dab814f4f8687b99efa090a3dc2"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.364212 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.364257 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.413200 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.413264 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 
17:27:00.413287 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.413318 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.413340 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.476348 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:00 crc kubenswrapper[4948]: E1204 17:27:00.476524 4948 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:00 crc kubenswrapper[4948]: E1204 17:27:00.476591 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs podName:f47382b4-4f12-471b-92aa-5d4ccb9c0bf0 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:01.476570282 +0000 UTC m=+32.837644704 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs") pod "network-metrics-daemon-t6lr5" (UID: "f47382b4-4f12-471b-92aa-5d4ccb9c0bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.516630 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.516673 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.516683 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.516704 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.516714 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.620122 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.620166 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.620186 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.620204 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.620217 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.722429 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.722473 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.722485 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.722502 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.722515 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.824749 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.824969 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.825069 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.825155 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.825236 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.928221 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.928250 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.928259 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.928271 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:00 crc kubenswrapper[4948]: I1204 17:27:00.928281 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:00Z","lastTransitionTime":"2025-12-04T17:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.032082 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.032127 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.032143 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.032166 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.032179 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.135263 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.136109 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.136306 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.136453 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.136601 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.240118 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.240183 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.240198 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.240223 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.240236 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.343957 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.344028 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.344080 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.344112 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.344132 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.370698 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" event={"ID":"24089a47-fd9e-491e-9287-792b784e3752","Type":"ContainerStarted","Data":"cae672b8399bc3ba0d339741908dd98f47f520331d1ad09d5603bb1c26e497df"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.370772 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" event={"ID":"24089a47-fd9e-491e-9287-792b784e3752","Type":"ContainerStarted","Data":"5ef82137c5770baf0251edc41b11c4b0fd6138f7f33631dd97b7fe7641d5dd28"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.372690 4948 generic.go:334] "Generic (PLEG): container finished" podID="4170d85e-dba9-4cc0-8183-2b16aa4f43e7" containerID="e9d5a82747f75d4688d34d6d912c446f9bf13ea597cd939f3c2721f3679975bc" exitCode=0 Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.372758 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerDied","Data":"e9d5a82747f75d4688d34d6d912c446f9bf13ea597cd939f3c2721f3679975bc"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.383998 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.384083 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 
17:27:01.446786 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.446849 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.446862 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.446883 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.446898 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.488896 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:01 crc kubenswrapper[4948]: E1204 17:27:01.489290 4948 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:01 crc kubenswrapper[4948]: E1204 17:27:01.489390 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs podName:f47382b4-4f12-471b-92aa-5d4ccb9c0bf0 nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:03.489366493 +0000 UTC m=+34.850440895 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs") pod "network-metrics-daemon-t6lr5" (UID: "f47382b4-4f12-471b-92aa-5d4ccb9c0bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.550763 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.550824 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.550834 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.550856 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.550871 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.611471 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r2gln" podStartSLOduration=9.611363401 podStartE2EDuration="9.611363401s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:01.391912889 +0000 UTC m=+32.752987291" watchObservedRunningTime="2025-12-04 17:27:01.611363401 +0000 UTC m=+32.972437833" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.653564 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.653619 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.653638 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.653664 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.653681 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.756107 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.756143 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.756152 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.756169 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.756179 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.859631 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.859678 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.859696 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.859722 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.859740 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.913511 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:01 crc kubenswrapper[4948]: E1204 17:27:01.913675 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.913852 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.913874 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.913913 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:01 crc kubenswrapper[4948]: E1204 17:27:01.913912 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:01 crc kubenswrapper[4948]: E1204 17:27:01.913961 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:01 crc kubenswrapper[4948]: E1204 17:27:01.914080 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.963347 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.963395 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.963406 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.963421 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:01 crc kubenswrapper[4948]: I1204 17:27:01.963432 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:01Z","lastTransitionTime":"2025-12-04T17:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.066974 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.067012 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.067032 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.067066 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.067077 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.170197 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.170267 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.170287 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.170312 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.170328 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.273163 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.273212 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.273230 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.273256 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.273274 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.376858 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.376923 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.376939 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.376967 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.377002 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.479539 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.479597 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.479610 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.479635 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.479653 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.582647 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.582703 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.582720 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.582743 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.582757 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.685647 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.685701 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.685717 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.685741 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.685757 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.788978 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.789037 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.789068 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.789086 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.789097 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.892830 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.892910 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.892980 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.893015 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.893038 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.996389 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.996429 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.996444 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.996475 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:02 crc kubenswrapper[4948]: I1204 17:27:02.996491 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:02Z","lastTransitionTime":"2025-12-04T17:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.099343 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.099783 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.099805 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.099835 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.099859 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.203213 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.204228 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.204279 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.204316 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.204348 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.307287 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.307343 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.307354 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.307372 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.307383 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.393617 4948 generic.go:334] "Generic (PLEG): container finished" podID="4170d85e-dba9-4cc0-8183-2b16aa4f43e7" containerID="c2b4bb7b32fc18a6456e106568e83f45d771161fe8254a2d0ca6185d9497d760" exitCode=0 Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.393691 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerDied","Data":"c2b4bb7b32fc18a6456e106568e83f45d771161fe8254a2d0ca6185d9497d760"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.399593 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.409955 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.410013 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.410029 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.410105 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.410126 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.513504 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.513608 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.513656 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.513670 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.513700 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.513717 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: E1204 17:27:03.513982 4948 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:03 crc kubenswrapper[4948]: E1204 17:27:03.514133 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs podName:f47382b4-4f12-471b-92aa-5d4ccb9c0bf0 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:07.514099472 +0000 UTC m=+38.875174074 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs") pod "network-metrics-daemon-t6lr5" (UID: "f47382b4-4f12-471b-92aa-5d4ccb9c0bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.616393 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.616455 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.616468 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.616490 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.616516 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.721607 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.721687 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.721705 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.721732 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.721748 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.824521 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.824572 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.824585 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.824606 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.824619 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.913107 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:03 crc kubenswrapper[4948]: E1204 17:27:03.913427 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.913825 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:03 crc kubenswrapper[4948]: E1204 17:27:03.914352 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.914438 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.914620 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:03 crc kubenswrapper[4948]: E1204 17:27:03.915063 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:03 crc kubenswrapper[4948]: E1204 17:27:03.915346 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.930490 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.930542 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.930558 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.930581 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:03 crc kubenswrapper[4948]: I1204 17:27:03.930595 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:03Z","lastTransitionTime":"2025-12-04T17:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.034391 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.034454 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.034481 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.034502 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.034518 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.139412 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.139486 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.139505 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.139531 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.139549 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.242710 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.242780 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.242798 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.242825 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.242844 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.346150 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.346213 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.346225 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.346247 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.346259 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.407903 4948 generic.go:334] "Generic (PLEG): container finished" podID="4170d85e-dba9-4cc0-8183-2b16aa4f43e7" containerID="7a49b5e0d3b7656350e942ec8dcb694f2b1862a09fb87133dd07f2b882b57cb2" exitCode=0 Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.408002 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerDied","Data":"7a49b5e0d3b7656350e942ec8dcb694f2b1862a09fb87133dd07f2b882b57cb2"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.449164 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.449211 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.449222 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.449242 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.449253 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.552397 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.552504 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.552520 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.552546 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.552567 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.655999 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.656083 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.656098 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.656121 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.656135 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.759241 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.759729 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.759911 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.760107 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.760288 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.863405 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.863481 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.863496 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.863517 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.863531 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.966143 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.966198 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.966211 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.966230 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:04 crc kubenswrapper[4948]: I1204 17:27:04.966244 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:04Z","lastTransitionTime":"2025-12-04T17:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.069199 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.069249 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.069264 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.069286 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.069299 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.173158 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.173218 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.173238 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.173266 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.173286 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.276833 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.276871 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.276884 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.276899 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.276910 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.380175 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.380233 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.380245 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.380351 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.380374 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.484637 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.484712 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.484724 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.484747 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.484761 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.589131 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.589614 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.589626 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.589647 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.589660 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.692317 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.692360 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.692371 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.692393 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.692407 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.796269 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.796320 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.796333 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.796356 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.796372 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.899964 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.900064 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.900087 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.900117 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.900140 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:05Z","lastTransitionTime":"2025-12-04T17:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.913228 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.913375 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.914033 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.914125 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.914183 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.914233 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.914294 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.914352 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.956374 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.956580 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:05 crc kubenswrapper[4948]: I1204 17:27:05.956663 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.956830 4948 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.956905 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.956881505 +0000 UTC m=+53.317955947 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.957583 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.957558873 +0000 UTC m=+53.318633305 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.957681 4948 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:27:05 crc kubenswrapper[4948]: E1204 17:27:05.957753 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.957731787 +0000 UTC m=+53.318806229 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.014507 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.014575 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.014591 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.014615 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.014636 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.057663 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.057737 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.057945 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.057969 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.057983 4948 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.058076 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.058040674 +0000 UTC m=+53.419115076 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.058367 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.058953 4948 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.058992 4948 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:27:06 crc kubenswrapper[4948]: E1204 17:27:06.059369 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.059326077 +0000 UTC m=+53.420400649 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.118122 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.118154 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.118162 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.118176 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.118185 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.220989 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.221022 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.221032 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.221078 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.221089 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.324162 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.324238 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.324264 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.324296 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.324317 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.424298 4948 generic.go:334] "Generic (PLEG): container finished" podID="4170d85e-dba9-4cc0-8183-2b16aa4f43e7" containerID="67abe9ba633090340e4ab5d761bd21481fbc49af5c2f4562d5fc09a6c62a449b" exitCode=0 Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.424381 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerDied","Data":"67abe9ba633090340e4ab5d761bd21481fbc49af5c2f4562d5fc09a6c62a449b"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.427403 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.427427 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.427437 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.427453 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.427464 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.430917 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerStarted","Data":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.529848 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.529894 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.529906 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.529925 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.529936 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.632930 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.632974 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.632989 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.633008 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.633021 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.736619 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.736677 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.736689 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.736710 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.736723 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.839743 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.839805 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.839828 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.839858 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.839879 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.942610 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.942677 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.942697 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.942722 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.942742 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.954629 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.954696 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.954714 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.954736 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 17:27:06 crc kubenswrapper[4948]: I1204 17:27:06.954752 4948 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T17:27:06Z","lastTransitionTime":"2025-12-04T17:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.014852 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt"] Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.015594 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.018273 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.019306 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.019493 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.019528 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.069668 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e39190c-d3a0-4bab-9433-0d058694d508-service-ca\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.069921 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4e39190c-d3a0-4bab-9433-0d058694d508-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.070023 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4e39190c-d3a0-4bab-9433-0d058694d508-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.070166 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e39190c-d3a0-4bab-9433-0d058694d508-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.070319 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4e39190c-d3a0-4bab-9433-0d058694d508-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172180 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e39190c-d3a0-4bab-9433-0d058694d508-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172240 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4e39190c-d3a0-4bab-9433-0d058694d508-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" 
Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172287 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e39190c-d3a0-4bab-9433-0d058694d508-service-ca\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172310 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4e39190c-d3a0-4bab-9433-0d058694d508-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172337 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e39190c-d3a0-4bab-9433-0d058694d508-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172420 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4e39190c-d3a0-4bab-9433-0d058694d508-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.172564 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4e39190c-d3a0-4bab-9433-0d058694d508-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.173779 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e39190c-d3a0-4bab-9433-0d058694d508-service-ca\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.179940 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e39190c-d3a0-4bab-9433-0d058694d508-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.202797 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e39190c-d3a0-4bab-9433-0d058694d508-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-56zxt\" (UID: \"4e39190c-d3a0-4bab-9433-0d058694d508\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.344733 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" Dec 04 17:27:07 crc kubenswrapper[4948]: W1204 17:27:07.361673 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e39190c_d3a0_4bab_9433_0d058694d508.slice/crio-3a45e5900cafdde26f946cd6abb18395b3a3d880df643ecb422a675559605aa8 WatchSource:0}: Error finding container 3a45e5900cafdde26f946cd6abb18395b3a3d880df643ecb422a675559605aa8: Status 404 returned error can't find the container with id 3a45e5900cafdde26f946cd6abb18395b3a3d880df643ecb422a675559605aa8 Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.436436 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" event={"ID":"4e39190c-d3a0-4bab-9433-0d058694d508","Type":"ContainerStarted","Data":"3a45e5900cafdde26f946cd6abb18395b3a3d880df643ecb422a675559605aa8"} Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.443538 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" event={"ID":"4170d85e-dba9-4cc0-8183-2b16aa4f43e7","Type":"ContainerStarted","Data":"91a72e88cf1eedecfdf0db1be5a426458a075ab0012e5ed6a650f68f054c2864"} Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.443965 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.444162 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.475482 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podStartSLOduration=15.47546243 podStartE2EDuration="15.47546243s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:07.473871249 +0000 UTC m=+38.834945661" watchObservedRunningTime="2025-12-04 17:27:07.47546243 +0000 UTC m=+38.836536842" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.577589 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:07 crc kubenswrapper[4948]: E1204 17:27:07.577758 4948 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:07 crc kubenswrapper[4948]: E1204 17:27:07.577816 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs podName:f47382b4-4f12-471b-92aa-5d4ccb9c0bf0 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:15.577798539 +0000 UTC m=+46.938872951 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs") pod "network-metrics-daemon-t6lr5" (UID: "f47382b4-4f12-471b-92aa-5d4ccb9c0bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.825256 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.826242 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.913001 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.913024 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.913106 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:07 crc kubenswrapper[4948]: E1204 17:27:07.913205 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:07 crc kubenswrapper[4948]: I1204 17:27:07.913240 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:07 crc kubenswrapper[4948]: E1204 17:27:07.913428 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:07 crc kubenswrapper[4948]: E1204 17:27:07.913536 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:07 crc kubenswrapper[4948]: E1204 17:27:07.913662 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:08 crc kubenswrapper[4948]: I1204 17:27:08.446616 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:27:08 crc kubenswrapper[4948]: I1204 17:27:08.467497 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2gnsr" podStartSLOduration=16.467478183 podStartE2EDuration="16.467478183s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:08.467447682 +0000 UTC m=+39.828522084" watchObservedRunningTime="2025-12-04 17:27:08.467478183 +0000 UTC m=+39.828552585" Dec 04 17:27:09 crc kubenswrapper[4948]: I1204 17:27:09.448781 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:27:09 crc kubenswrapper[4948]: I1204 17:27:09.912938 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:09 crc kubenswrapper[4948]: E1204 17:27:09.913145 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:09 crc kubenswrapper[4948]: I1204 17:27:09.913649 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:09 crc kubenswrapper[4948]: E1204 17:27:09.913714 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:09 crc kubenswrapper[4948]: I1204 17:27:09.913757 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:09 crc kubenswrapper[4948]: E1204 17:27:09.913798 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:09 crc kubenswrapper[4948]: I1204 17:27:09.913840 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:09 crc kubenswrapper[4948]: E1204 17:27:09.913884 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:10 crc kubenswrapper[4948]: I1204 17:27:10.455341 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" event={"ID":"4e39190c-d3a0-4bab-9433-0d058694d508","Type":"ContainerStarted","Data":"c5300076cbd2344ffe19ff2782e4d16cc556fa540dbeb4ea19427dc944917405"} Dec 04 17:27:10 crc kubenswrapper[4948]: I1204 17:27:10.511265 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t6lr5"] Dec 04 17:27:10 crc kubenswrapper[4948]: I1204 17:27:10.511444 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:10 crc kubenswrapper[4948]: E1204 17:27:10.511591 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:11 crc kubenswrapper[4948]: I1204 17:27:11.913731 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:11 crc kubenswrapper[4948]: I1204 17:27:11.913790 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:11 crc kubenswrapper[4948]: I1204 17:27:11.913827 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:11 crc kubenswrapper[4948]: E1204 17:27:11.913881 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:11 crc kubenswrapper[4948]: E1204 17:27:11.913995 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:11 crc kubenswrapper[4948]: E1204 17:27:11.914152 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:12 crc kubenswrapper[4948]: I1204 17:27:12.912918 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:12 crc kubenswrapper[4948]: E1204 17:27:12.913194 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.857914 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.858454 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.879644 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.913317 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.913325 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.913411 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:13 crc kubenswrapper[4948]: E1204 17:27:13.913563 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 17:27:13 crc kubenswrapper[4948]: E1204 17:27:13.914144 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 17:27:13 crc kubenswrapper[4948]: E1204 17:27:13.914254 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 17:27:13 crc kubenswrapper[4948]: I1204 17:27:13.925938 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-56zxt" podStartSLOduration=22.925910779 podStartE2EDuration="22.925910779s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:13.485912487 +0000 UTC m=+44.846986939" watchObservedRunningTime="2025-12-04 17:27:13.925910779 +0000 UTC m=+45.286985211" Dec 04 17:27:14 crc kubenswrapper[4948]: I1204 17:27:14.912765 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:14 crc kubenswrapper[4948]: E1204 17:27:14.913236 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t6lr5" podUID="f47382b4-4f12-471b-92aa-5d4ccb9c0bf0" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.622268 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:15 crc kubenswrapper[4948]: E1204 17:27:15.623472 4948 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:15 crc kubenswrapper[4948]: E1204 17:27:15.623720 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs podName:f47382b4-4f12-471b-92aa-5d4ccb9c0bf0 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:31.623693664 +0000 UTC m=+62.984768096 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs") pod "network-metrics-daemon-t6lr5" (UID: "f47382b4-4f12-471b-92aa-5d4ccb9c0bf0") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.754377 4948 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.754553 4948 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.790183 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-trshb"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.791038 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.798833 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.799484 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.799818 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-m5k2z"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.800284 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.800899 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.801492 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.802260 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 17:27:15 crc kubenswrapper[4948]: W1204 17:27:15.802642 4948 reflector.go:561] object-"openshift-console"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Dec 04 17:27:15 crc kubenswrapper[4948]: E1204 17:27:15.802749 4948 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.802774 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.802901 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.802982 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.803089 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.802802 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 17:27:15 crc 
kubenswrapper[4948]: I1204 17:27:15.803436 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.803911 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.805020 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.805302 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.805471 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.805906 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xfvmf"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.806233 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7jtdw"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.806322 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.806540 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xfvmf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.806770 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7jtdw"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.813563 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rz5j4"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.814354 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.817357 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.817537 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.817691 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.817805 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.817823 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.821231 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.824256 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.825247 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.825537 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.826385 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.827809 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.827829 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771c1e0f-69a0-4bf2-8345-37ed755de8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.827896 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1b9652-58ed-4708-8cae-58cf5b66d439-serving-cert\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837001 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837313 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837541 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837567 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837643 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837721 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837750 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-serving-cert\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837778 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09e20532-f709-4854-82c2-7b84e2d62950-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837800 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-serving-cert\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837828 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-oauth-config\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837866 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.837867 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6trm\" (UniqueName: \"kubernetes.io/projected/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-kube-api-access-j6trm\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838178 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-encryption-config\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838271 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtp4d\" (UniqueName: \"kubernetes.io/projected/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-kube-api-access-dtp4d\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838348 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-config\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838522 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-trusted-ca-bundle\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838596 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-oauth-serving-cert\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838661 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838732 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357c70a4-c799-43ba-8d28-ca99269d41fc-node-pullsecrets\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838804 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f1b9652-58ed-4708-8cae-58cf5b66d439-trusted-ca\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838878 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838953 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-config\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839036 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-etcd-serving-ca\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839136 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7hvm\" (UniqueName: \"kubernetes.io/projected/357c70a4-c799-43ba-8d28-ca99269d41fc-kube-api-access-l7hvm\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839206 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1b9652-58ed-4708-8cae-58cf5b66d439-config\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839276 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e20532-f709-4854-82c2-7b84e2d62950-config\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839344 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqh5\" (UniqueName: \"kubernetes.io/projected/09e20532-f709-4854-82c2-7b84e2d62950-kube-api-access-4qqh5\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839415 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6gb\" (UniqueName: \"kubernetes.io/projected/c3556602-2a66-48fb-a187-85849f5c08e4-kube-api-access-2g6gb\") pod \"downloads-7954f5f757-7jtdw\" (UID: \"c3556602-2a66-48fb-a187-85849f5c08e4\") " pod="openshift-console/downloads-7954f5f757-7jtdw"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.838197 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.839674 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnrt6\" (UniqueName: \"kubernetes.io/projected/771c1e0f-69a0-4bf2-8345-37ed755de8ff-kube-api-access-lnrt6\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.840037 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.840488 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.840868 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841132 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-etcd-client\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841188 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-audit\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841204 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/357c70a4-c799-43ba-8d28-ca99269d41fc-audit-dir\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841230 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-images\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841246 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-image-import-ca\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841265 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57q2\" (UniqueName: \"kubernetes.io/projected/6f1b9652-58ed-4708-8cae-58cf5b66d439-kube-api-access-x57q2\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841279 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-service-ca\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841294 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-config\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841313 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-config\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841456 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.841801 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g7mvh"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.842200 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.842463 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.844221 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.845763 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.845987 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.846401 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.846689 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.846752 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-f7tp6"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.847448 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.847582 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.848242 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4qwlk"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.848555 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5275t"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.848918 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5275t"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.849316 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.849369 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.849502 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.849741 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.850849 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-595fv"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.851392 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.853228 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.854904 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.862850 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863146 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863293 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863584 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863742 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.864267 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.864724 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.864907 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863203 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863891 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863968 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.864015 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.864078 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.865648 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.865758 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863305 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.866007 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.866020 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.866127 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.866166 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.863933 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.866678 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-zgswc"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.866832 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.867003 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d6slp"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.867004 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.867100 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-zgswc"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.868134 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d6slp"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.869864 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.869968 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.870206 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.870295 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.870348 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.869912 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.870528 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.870695 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.870895 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871102 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871112 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871302 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871416 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871428 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871443 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871557 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.871558 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.872219 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.873388 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.873872 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.891105 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.892674 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.911441 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.911909 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.912282 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.912573 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.912717 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2tl2h"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.913110 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.913447 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.913576 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.914124 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.914423 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.914670 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf"]
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.914995 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.920290 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.921774 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.922160 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.922487 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.922720 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.923312 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.923438 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.923472 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.923495 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.925944 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.926129 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.926350 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.925946 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.926402 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.926936 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.926980 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.927236 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.927303 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.927405 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.927583 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.927790 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.927921 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.928187 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.928234 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.928323 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.928479 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.929299 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.929765 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.933519 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.934245 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.934448 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.934595 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hbqk5"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.934872 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.935236 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.935246 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.935399 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941379 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941394 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941899 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-audit\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941930 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941962 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-images\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941978 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-image-import-ca\") pod \"apiserver-76f77b778f-trshb\" 
(UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.941998 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/357c70a4-c799-43ba-8d28-ca99269d41fc-audit-dir\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942025 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-service-ca\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942059 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-config\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942075 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57q2\" (UniqueName: \"kubernetes.io/projected/6f1b9652-58ed-4708-8cae-58cf5b66d439-kube-api-access-x57q2\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942138 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942155 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942172 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942191 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942209 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjt2p\" (UniqueName: \"kubernetes.io/projected/87c3b0a5-59be-438e-a074-b3f5b154039e-kube-api-access-cjt2p\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: 
\"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942224 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-config\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942231 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942239 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-audit-policies\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942256 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771c1e0f-69a0-4bf2-8345-37ed755de8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942271 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1b9652-58ed-4708-8cae-58cf5b66d439-serving-cert\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " 
pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942286 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87c3b0a5-59be-438e-a074-b3f5b154039e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942301 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942317 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942332 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fbe65a5-28e7-40db-85f7-66d00806dcbe-trusted-ca\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942347 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09e20532-f709-4854-82c2-7b84e2d62950-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942362 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-serving-cert\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942379 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942396 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hls7q\" (UniqueName: \"kubernetes.io/projected/09680d2b-7d6e-4dcd-bf38-d4642fe27ac2-kube-api-access-hls7q\") pod \"dns-operator-744455d44c-f7tp6\" (UID: \"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2\") " pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942412 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-serving-cert\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " 
pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942432 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-oauth-config\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942450 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-ca\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942466 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/de05452b-7cdf-44da-a351-b21ba3691f41-signing-key\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942481 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62821e25-9412-4650-a9e0-34f4fe49656b-audit-dir\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942496 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6trm\" (UniqueName: \"kubernetes.io/projected/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-kube-api-access-j6trm\") pod \"machine-api-operator-5694c8668f-xqmtt\" 
(UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942512 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-encryption-config\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942526 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-config\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942544 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rwb\" (UniqueName: \"kubernetes.io/projected/62821e25-9412-4650-a9e0-34f4fe49656b-kube-api-access-b5rwb\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942561 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6fzp\" (UniqueName: \"kubernetes.io/projected/4fbe65a5-28e7-40db-85f7-66d00806dcbe-kube-api-access-w6fzp\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942580 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtp4d\" (UniqueName: 
\"kubernetes.io/projected/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-kube-api-access-dtp4d\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942601 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942620 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-config\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942639 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c3b0a5-59be-438e-a074-b3f5b154039e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942670 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942693 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-trusted-ca-bundle\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942713 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-oauth-serving-cert\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942734 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942755 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqhht\" (UniqueName: \"kubernetes.io/projected/c0514d31-211c-4b78-b2a3-8536fe75604d-kube-api-access-xqhht\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942775 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f1b9652-58ed-4708-8cae-58cf5b66d439-trusted-ca\") pod 
\"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942795 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357c70a4-c799-43ba-8d28-ca99269d41fc-node-pullsecrets\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942832 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942852 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942876 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0514d31-211c-4b78-b2a3-8536fe75604d-serving-cert\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942898 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942918 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwpfp\" (UniqueName: \"kubernetes.io/projected/de05452b-7cdf-44da-a351-b21ba3691f41-kube-api-access-nwpfp\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942942 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-config\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942960 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-etcd-serving-ca\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.942982 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7hvm\" (UniqueName: \"kubernetes.io/projected/357c70a4-c799-43ba-8d28-ca99269d41fc-kube-api-access-l7hvm\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc 
kubenswrapper[4948]: I1204 17:27:15.942997 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/de05452b-7cdf-44da-a351-b21ba3691f41-signing-cabundle\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943011 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09680d2b-7d6e-4dcd-bf38-d4642fe27ac2-metrics-tls\") pod \"dns-operator-744455d44c-f7tp6\" (UID: \"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2\") " pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943027 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1b9652-58ed-4708-8cae-58cf5b66d439-config\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943058 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-client\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943074 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4005cfa7-7eda-43d9-ba7f-fe06d42c82d2-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-hg692\" (UID: \"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943090 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e20532-f709-4854-82c2-7b84e2d62950-config\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943105 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-service-ca\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943122 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqh5\" (UniqueName: \"kubernetes.io/projected/09e20532-f709-4854-82c2-7b84e2d62950-kube-api-access-4qqh5\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943125 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fltqd"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943137 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6gb\" (UniqueName: \"kubernetes.io/projected/c3556602-2a66-48fb-a187-85849f5c08e4-kube-api-access-2g6gb\") pod \"downloads-7954f5f757-7jtdw\" (UID: 
\"c3556602-2a66-48fb-a187-85849f5c08e4\") " pod="openshift-console/downloads-7954f5f757-7jtdw" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943161 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fbe65a5-28e7-40db-85f7-66d00806dcbe-metrics-tls\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943175 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fbe65a5-28e7-40db-85f7-66d00806dcbe-bound-sa-token\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943190 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnrt6\" (UniqueName: \"kubernetes.io/projected/771c1e0f-69a0-4bf2-8345-37ed755de8ff-kube-api-access-lnrt6\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943212 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-etcd-client\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943232 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckfz\" (UniqueName: 
\"kubernetes.io/projected/4005cfa7-7eda-43d9-ba7f-fe06d42c82d2-kube-api-access-vckfz\") pod \"control-plane-machine-set-operator-78cbb6b69f-hg692\" (UID: \"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943339 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-config\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943840 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-audit\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.943884 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.945089 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-oauth-serving-cert\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.945247 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.945825 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.946942 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f1b9652-58ed-4708-8cae-58cf5b66d439-config\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.947009 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-client-ca\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.956480 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-oauth-config\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.957234 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-images\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.957618 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09e20532-f709-4854-82c2-7b84e2d62950-config\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" 
(UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.957912 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f1b9652-58ed-4708-8cae-58cf5b66d439-trusted-ca\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.960738 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/357c70a4-c799-43ba-8d28-ca99269d41fc-audit-dir\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.961098 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-serving-cert\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.962172 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-image-import-ca\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.962945 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-service-ca\") pod \"console-f9d7485db-m5k2z\" (UID: 
\"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.963020 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-config\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.964241 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-config\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.964341 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-config\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.964402 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/357c70a4-c799-43ba-8d28-ca99269d41fc-node-pullsecrets\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.964498 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.964768 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-etcd-serving-ca\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.965017 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-serving-cert\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.965099 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.965458 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f1b9652-58ed-4708-8cae-58cf5b66d439-serving-cert\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.965853 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-trusted-ca-bundle\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.966537 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357c70a4-c799-43ba-8d28-ca99269d41fc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.966613 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-etcd-client\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.984559 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771c1e0f-69a0-4bf2-8345-37ed755de8ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.985062 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.985493 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.985905 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.973797 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/357c70a4-c799-43ba-8d28-ca99269d41fc-encryption-config\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.986457 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09e20532-f709-4854-82c2-7b84e2d62950-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.986498 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.987494 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.987636 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.987840 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.989087 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.989249 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.989718 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-trshb"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.990105 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.990475 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tqbdj"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.990840 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.991219 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.991313 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m5k2z"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.992171 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-h58bm"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.993437 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8k5hb"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.994403 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.994412 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.994748 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-595fv"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.994803 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.995060 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g7mvh"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.996086 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.996916 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.997826 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.998675 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-f7tp6"] Dec 04 17:27:15 crc kubenswrapper[4948]: I1204 17:27:15.999568 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.000711 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.001376 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7jtdw"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.002333 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4qwlk"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.003390 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl"] Dec 04 17:27:16 crc 
kubenswrapper[4948]: I1204 17:27:16.004228 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xfvmf"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.005697 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rz5j4"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.006661 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.008457 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xxt7h"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.009025 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.009500 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.010379 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.011619 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.012346 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tqbdj"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.013420 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.013495 4948 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.014517 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2tl2h"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.015292 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.016206 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.017093 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fltqd"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.018001 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.018887 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d6slp"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.019780 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.020630 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hbqk5"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.021558 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5275t"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.022440 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf"] Dec 04 17:27:16 crc 
kubenswrapper[4948]: I1204 17:27:16.023292 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tsgwn"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.024147 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.024232 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.024969 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8k5hb"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.028067 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tsgwn"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.029598 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.031887 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.033826 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.035650 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.037205 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b"] Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.043697 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09680d2b-7d6e-4dcd-bf38-d4642fe27ac2-metrics-tls\") pod \"dns-operator-744455d44c-f7tp6\" (UID: \"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2\") " pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.043808 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/de05452b-7cdf-44da-a351-b21ba3691f41-signing-cabundle\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.043885 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-client\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.043965 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4005cfa7-7eda-43d9-ba7f-fe06d42c82d2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hg692\" (UID: \"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044106 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-service-ca\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 
04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044263 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fbe65a5-28e7-40db-85f7-66d00806dcbe-metrics-tls\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044380 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fbe65a5-28e7-40db-85f7-66d00806dcbe-bound-sa-token\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044548 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckfz\" (UniqueName: \"kubernetes.io/projected/4005cfa7-7eda-43d9-ba7f-fe06d42c82d2-kube-api-access-vckfz\") pod \"control-plane-machine-set-operator-78cbb6b69f-hg692\" (UID: \"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044665 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044798 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044840 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-service-ca\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044908 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.044989 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-audit-policies\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045020 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045093 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045121 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjt2p\" (UniqueName: \"kubernetes.io/projected/87c3b0a5-59be-438e-a074-b3f5b154039e-kube-api-access-cjt2p\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045155 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87c3b0a5-59be-438e-a074-b3f5b154039e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045177 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fbe65a5-28e7-40db-85f7-66d00806dcbe-trusted-ca\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045193 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045219 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045235 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hls7q\" (UniqueName: \"kubernetes.io/projected/09680d2b-7d6e-4dcd-bf38-d4642fe27ac2-kube-api-access-hls7q\") pod \"dns-operator-744455d44c-f7tp6\" (UID: \"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2\") " pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045255 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-ca\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045275 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-config\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045290 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/de05452b-7cdf-44da-a351-b21ba3691f41-signing-key\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045306 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62821e25-9412-4650-a9e0-34f4fe49656b-audit-dir\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045330 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6fzp\" (UniqueName: \"kubernetes.io/projected/4fbe65a5-28e7-40db-85f7-66d00806dcbe-kube-api-access-w6fzp\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045345 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rwb\" (UniqueName: \"kubernetes.io/projected/62821e25-9412-4650-a9e0-34f4fe49656b-kube-api-access-b5rwb\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045465 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 
17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045484 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c3b0a5-59be-438e-a074-b3f5b154039e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045518 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqhht\" (UniqueName: \"kubernetes.io/projected/c0514d31-211c-4b78-b2a3-8536fe75604d-kube-api-access-xqhht\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045534 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045561 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0514d31-211c-4b78-b2a3-8536fe75604d-serving-cert\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045583 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045601 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.045622 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwpfp\" (UniqueName: \"kubernetes.io/projected/de05452b-7cdf-44da-a351-b21ba3691f41-kube-api-access-nwpfp\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.046070 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-ca\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.046190 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0514d31-211c-4b78-b2a3-8536fe75604d-config\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.046286 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fbe65a5-28e7-40db-85f7-66d00806dcbe-trusted-ca\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.046497 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62821e25-9412-4650-a9e0-34f4fe49656b-audit-dir\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.046745 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87c3b0a5-59be-438e-a074-b3f5b154039e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.048505 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09680d2b-7d6e-4dcd-bf38-d4642fe27ac2-metrics-tls\") pod \"dns-operator-744455d44c-f7tp6\" (UID: \"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2\") " pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.048585 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87c3b0a5-59be-438e-a074-b3f5b154039e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 
17:27:16.048825 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c0514d31-211c-4b78-b2a3-8536fe75604d-etcd-client\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.049181 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0514d31-211c-4b78-b2a3-8536fe75604d-serving-cert\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.049537 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/de05452b-7cdf-44da-a351-b21ba3691f41-signing-key\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.049617 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4fbe65a5-28e7-40db-85f7-66d00806dcbe-metrics-tls\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.053719 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.055079 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/de05452b-7cdf-44da-a351-b21ba3691f41-signing-cabundle\") pod \"service-ca-9c57cc56f-d6slp\" (UID: 
\"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.073686 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.094216 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.113512 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.133512 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.154528 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.173682 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.193814 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.214448 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.253773 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 17:27:16 crc 
kubenswrapper[4948]: I1204 17:27:16.274228 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.295271 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.313985 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.319843 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.354845 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.360011 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.361039 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.369060 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.374155 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.379680 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.393751 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.398895 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.414255 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.434241 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 17:27:16 crc 
kubenswrapper[4948]: I1204 17:27:16.439482 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.454120 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.462515 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.475163 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.488491 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.493813 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.496869 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-audit-policies\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.513915 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.517109 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.535659 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.537614 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.563348 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.567813 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.575784 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.594623 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.614123 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.635001 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.654472 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.674713 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.694229 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.701196 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4005cfa7-7eda-43d9-ba7f-fe06d42c82d2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hg692\" (UID: \"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.714610 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.754793 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.774999 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.794532 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.814361 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.834580 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.854879 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.874122 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.894803 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.912824 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.914148 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.934671 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.961210 4948 request.go:700] Waited for 1.025638114s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.963061 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.973741 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 17:27:16 crc kubenswrapper[4948]: I1204 17:27:16.995640 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.020313 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.033821 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.054491 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.077500 4948 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.093659 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.114248 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.134259 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.154591 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.188164 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6trm\" (UniqueName: \"kubernetes.io/projected/1fb6542e-ebb3-4df7-95d3-7c6c55fcd845-kube-api-access-j6trm\") pod \"machine-api-operator-5694c8668f-xqmtt\" (UID: \"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.194568 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.214291 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.233712 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.253799 4948 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.303354 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqh5\" (UniqueName: \"kubernetes.io/projected/09e20532-f709-4854-82c2-7b84e2d62950-kube-api-access-4qqh5\") pod \"openshift-apiserver-operator-796bbdcf4f-npdsj\" (UID: \"09e20532-f709-4854-82c2-7b84e2d62950\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.338638 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnrt6\" (UniqueName: \"kubernetes.io/projected/771c1e0f-69a0-4bf2-8345-37ed755de8ff-kube-api-access-lnrt6\") pod \"route-controller-manager-6576b87f9c-m2wdp\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.348927 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.360823 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57q2\" (UniqueName: \"kubernetes.io/projected/6f1b9652-58ed-4708-8cae-58cf5b66d439-kube-api-access-x57q2\") pod \"console-operator-58897d9998-xfvmf\" (UID: \"6f1b9652-58ed-4708-8cae-58cf5b66d439\") " pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.364858 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.394632 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.402383 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7hvm\" (UniqueName: \"kubernetes.io/projected/357c70a4-c799-43ba-8d28-ca99269d41fc-kube-api-access-l7hvm\") pod \"apiserver-76f77b778f-trshb\" (UID: \"357c70a4-c799-43ba-8d28-ca99269d41fc\") " pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.414598 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.415010 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.426683 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.434099 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.454239 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.476871 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.494286 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.514059 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.534243 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.557861 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.574382 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.594208 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.613677 4948 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.615724 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.634154 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.654310 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.674173 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.707669 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.714893 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.734843 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.740422 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xqmtt"] Dec 04 17:27:17 crc kubenswrapper[4948]: W1204 17:27:17.749167 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fb6542e_ebb3_4df7_95d3_7c6c55fcd845.slice/crio-5fd1f7b266ee192ea1da8099a8c9183708bb1dafa7d545d1a016aee1c986ef72 WatchSource:0}: Error finding 
container 5fd1f7b266ee192ea1da8099a8c9183708bb1dafa7d545d1a016aee1c986ef72: Status 404 returned error can't find the container with id 5fd1f7b266ee192ea1da8099a8c9183708bb1dafa7d545d1a016aee1c986ef72 Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.753729 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.774034 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.796978 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.813225 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.829380 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj"] Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.830738 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xfvmf"] Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.832643 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"] Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.833312 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.834113 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-trshb"] Dec 04 17:27:17 crc kubenswrapper[4948]: W1204 17:27:17.837285 4948 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod771c1e0f_69a0_4bf2_8345_37ed755de8ff.slice/crio-6d56fc569bdb59ff329acbc4acb74a0ed2382cb76d25007d5e021e6916dc308f WatchSource:0}: Error finding container 6d56fc569bdb59ff329acbc4acb74a0ed2382cb76d25007d5e021e6916dc308f: Status 404 returned error can't find the container with id 6d56fc569bdb59ff329acbc4acb74a0ed2382cb76d25007d5e021e6916dc308f Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.852597 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.878240 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.894834 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.913030 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.933655 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.954682 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.972703 4948 request.go:700] Waited for 1.963298948s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0 Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.974173 4948 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 17:27:17 crc kubenswrapper[4948]: I1204 17:27:17.993803 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.014037 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.035094 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.054389 4948 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.073980 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.119354 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fbe65a5-28e7-40db-85f7-66d00806dcbe-bound-sa-token\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.129959 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckfz\" (UniqueName: \"kubernetes.io/projected/4005cfa7-7eda-43d9-ba7f-fe06d42c82d2-kube-api-access-vckfz\") pod \"control-plane-machine-set-operator-78cbb6b69f-hg692\" (UID: \"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.150776 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nwpfp\" (UniqueName: \"kubernetes.io/projected/de05452b-7cdf-44da-a351-b21ba3691f41-kube-api-access-nwpfp\") pod \"service-ca-9c57cc56f-d6slp\" (UID: \"de05452b-7cdf-44da-a351-b21ba3691f41\") " pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.174992 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjt2p\" (UniqueName: \"kubernetes.io/projected/87c3b0a5-59be-438e-a074-b3f5b154039e-kube-api-access-cjt2p\") pod \"openshift-controller-manager-operator-756b6f6bc6-bvs4t\" (UID: \"87c3b0a5-59be-438e-a074-b3f5b154039e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.189868 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqhht\" (UniqueName: \"kubernetes.io/projected/c0514d31-211c-4b78-b2a3-8536fe75604d-kube-api-access-xqhht\") pod \"etcd-operator-b45778765-5275t\" (UID: \"c0514d31-211c-4b78-b2a3-8536fe75604d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.203308 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.208582 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hls7q\" (UniqueName: \"kubernetes.io/projected/09680d2b-7d6e-4dcd-bf38-d4642fe27ac2-kube-api-access-hls7q\") pod \"dns-operator-744455d44c-f7tp6\" (UID: \"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2\") " pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.232021 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6fzp\" (UniqueName: \"kubernetes.io/projected/4fbe65a5-28e7-40db-85f7-66d00806dcbe-kube-api-access-w6fzp\") pod \"ingress-operator-5b745b69d9-595fv\" (UID: \"4fbe65a5-28e7-40db-85f7-66d00806dcbe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.254080 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.255173 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rwb\" (UniqueName: \"kubernetes.io/projected/62821e25-9412-4650-a9e0-34f4fe49656b-kube-api-access-b5rwb\") pod \"oauth-openshift-558db77b4-2tl2h\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.294800 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.314127 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.316175 4948 projected.go:288] Couldn't get configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.316238 4948 projected.go:194] Error preparing data for projected volume kube-api-access-2g6gb for pod openshift-console/downloads-7954f5f757-7jtdw: failed to sync configmap cache: timed out waiting for the condition Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.316329 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3556602-2a66-48fb-a187-85849f5c08e4-kube-api-access-2g6gb podName:c3556602-2a66-48fb-a187-85849f5c08e4 nodeName:}" failed. No retries permitted until 2025-12-04 17:27:18.816309264 +0000 UTC m=+50.177383666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2g6gb" (UniqueName: "kubernetes.io/projected/c3556602-2a66-48fb-a187-85849f5c08e4-kube-api-access-2g6gb") pod "downloads-7954f5f757-7jtdw" (UID: "c3556602-2a66-48fb-a187-85849f5c08e4") : failed to sync configmap cache: timed out waiting for the condition Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.334738 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.340456 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtp4d\" (UniqueName: \"kubernetes.io/projected/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-kube-api-access-dtp4d\") pod \"console-f9d7485db-m5k2z\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.491110 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-trshb" event={"ID":"357c70a4-c799-43ba-8d28-ca99269d41fc","Type":"ContainerStarted","Data":"04fc77b3d74a9fb7b61086cdf0b225ad06a545db0ef9638d31fdf910f9bb7353"} Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.492203 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" event={"ID":"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845","Type":"ContainerStarted","Data":"5fd1f7b266ee192ea1da8099a8c9183708bb1dafa7d545d1a016aee1c986ef72"} Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.493323 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xfvmf" event={"ID":"6f1b9652-58ed-4708-8cae-58cf5b66d439","Type":"ContainerStarted","Data":"da62b175bc8313c78666ff5e8b312c99652aaa726f50ab2b780b5a78619d3d77"} Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.494491 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" event={"ID":"771c1e0f-69a0-4bf2-8345-37ed755de8ff","Type":"ContainerStarted","Data":"6d56fc569bdb59ff329acbc4acb74a0ed2382cb76d25007d5e021e6916dc308f"} Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.495487 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" event={"ID":"09e20532-f709-4854-82c2-7b84e2d62950","Type":"ContainerStarted","Data":"7ea4a30417be0d9c94d0dd465b036ea4908cdb59ebca37e812e7dc39d7a9cef5"} Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.499015 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.499222 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.499351 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.499768 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.502231 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rpg\" (UniqueName: \"kubernetes.io/projected/484ee778-914d-4c53-aa0e-6383472e1ebd-kube-api-access-m6rpg\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503146 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-config\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503308 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b106bf37-a37c-45a8-be6a-296e7288eb80-config\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503411 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5779a86-4384-4d85-8235-be7dfedc7c68-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: 
I1204 17:27:18.503506 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8c015fa-00f0-4670-990a-e830b7762674-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503580 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c5779a86-4384-4d85-8235-be7dfedc7c68-images\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503667 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfsw4\" (UniqueName: \"kubernetes.io/projected/720d0657-f05b-415e-a89b-cec265b15235-kube-api-access-kfsw4\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503704 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503796 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8c015fa-00f0-4670-990a-e830b7762674-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503875 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.503978 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr5zq\" (UniqueName: \"kubernetes.io/projected/d8c015fa-00f0-4670-990a-e830b7762674-kube-api-access-lr5zq\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.504114 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-tls\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.504187 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-encryption-config\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 
17:27:18.504302 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75j7\" (UniqueName: \"kubernetes.io/projected/f7dc39c4-5e34-4d07-909f-85761440a108-kube-api-access-l75j7\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505274 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f7dc39c4-5e34-4d07-909f-85761440a108-audit-dir\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505329 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j4pg\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-kube-api-access-2j4pg\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505370 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-default-certificate\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505437 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-stats-auth\") pod \"router-default-5444994796-zgswc\" (UID: 
\"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505467 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b106bf37-a37c-45a8-be6a-296e7288eb80-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505523 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-metrics-certs\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505627 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505680 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-config\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505717 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505949 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8c015fa-00f0-4670-990a-e830b7762674-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.505992 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/484ee778-914d-4c53-aa0e-6383472e1ebd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.506182 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-certificates\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.506512 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ef18d3-fb9b-46f0-82a0-4db3172f43a7-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-cjplw\" (UID: \"03ef18d3-fb9b-46f0-82a0-4db3172f43a7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.506572 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.506608 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720d0657-f05b-415e-a89b-cec265b15235-service-ca-bundle\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.506643 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-serving-cert\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.506695 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-service-ca-bundle\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc 
kubenswrapper[4948]: I1204 17:27:18.506769 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-etcd-client\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.507711 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-serving-cert\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.507751 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-bound-sa-token\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.507786 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5779a86-4384-4d85-8235-be7dfedc7c68-proxy-tls\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509071 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-trusted-ca\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: 
\"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509136 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-serving-cert\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509170 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509189 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-config\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509206 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22n7b\" (UniqueName: \"kubernetes.io/projected/03ef18d3-fb9b-46f0-82a0-4db3172f43a7-kube-api-access-22n7b\") pod \"package-server-manager-789f6589d5-cjplw\" (UID: \"03ef18d3-fb9b-46f0-82a0-4db3172f43a7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509227 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nk4n\" (UniqueName: \"kubernetes.io/projected/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-kube-api-access-7nk4n\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509247 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b106bf37-a37c-45a8-be6a-296e7288eb80-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509267 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/484ee778-914d-4c53-aa0e-6383472e1ebd-serving-cert\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509324 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-audit-policies\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509344 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509417 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-client-ca\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509464 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dwn4\" (UniqueName: \"kubernetes.io/projected/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-kube-api-access-4dwn4\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509486 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509509 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.509531 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcw4l\" (UniqueName: \"kubernetes.io/projected/c5779a86-4384-4d85-8235-be7dfedc7c68-kube-api-access-mcw4l\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.510193 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.010180484 +0000 UTC m=+50.371254886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.525108 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.542175 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.610224 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.610394 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.110351507 +0000 UTC m=+50.471425959 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611292 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c5779a86-4384-4d85-8235-be7dfedc7c68-images\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611351 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c9126688-8fd4-46db-8188-dc8014777a8d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611389 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702984bc-83a3-4da1-bd02-f8879e78502d-config-volume\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611426 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611463 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfsw4\" (UniqueName: \"kubernetes.io/projected/720d0657-f05b-415e-a89b-cec265b15235-kube-api-access-kfsw4\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611499 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: 
\"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611535 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97da035d-1f0b-4a53-bc43-04b3a495eda9-webhook-cert\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.611570 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612148 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e7e78009-45e5-40d0-a208-d0996554a35e-node-bootstrap-token\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612198 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8c015fa-00f0-4670-990a-e830b7762674-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612233 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612265 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090a1667-3d12-491d-96d4-3efddf82b503-cert\") pod \"ingress-canary-8k5hb\" (UID: \"090a1667-3d12-491d-96d4-3efddf82b503\") " pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612312 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c5779a86-4384-4d85-8235-be7dfedc7c68-images\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612339 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-registration-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612386 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kd7t\" (UniqueName: \"kubernetes.io/projected/e7e78009-45e5-40d0-a208-d0996554a35e-kube-api-access-5kd7t\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612747 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612781 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr5zq\" (UniqueName: \"kubernetes.io/projected/d8c015fa-00f0-4670-990a-e830b7762674-kube-api-access-lr5zq\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.612910 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/532c984d-78f6-4e46-be62-53cb87748bcb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fltqd\" (UID: \"532c984d-78f6-4e46-be62-53cb87748bcb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613088 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-tls\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613700 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613719 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-encryption-config\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613755 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhfw\" (UniqueName: \"kubernetes.io/projected/7c394cd7-6d7c-4880-911d-cc27cc380a17-kube-api-access-qhhfw\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613779 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39ae03b6-0da8-43f7-84d2-300f5d0648af-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613831 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75j7\" (UniqueName: \"kubernetes.io/projected/f7dc39c4-5e34-4d07-909f-85761440a108-kube-api-access-l75j7\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613853 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/39ae03b6-0da8-43f7-84d2-300f5d0648af-ready\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613876 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjfg\" (UniqueName: \"kubernetes.io/projected/702984bc-83a3-4da1-bd02-f8879e78502d-kube-api-access-dtjfg\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613915 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f7dc39c4-5e34-4d07-909f-85761440a108-audit-dir\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613939 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4jp\" (UniqueName: \"kubernetes.io/projected/39ae03b6-0da8-43f7-84d2-300f5d0648af-kube-api-access-zj4jp\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613962 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j4pg\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-kube-api-access-2j4pg\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.613985 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-default-certificate\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614008 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-stats-auth\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614030 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b106bf37-a37c-45a8-be6a-296e7288eb80-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614081 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5237623-6755-491c-8345-90f85db04335-config\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614104 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/280ee280-d01c-4e3e-8390-69e6eb19a579-profile-collector-cert\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614141 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614168 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7zxn\" (UniqueName: \"kubernetes.io/projected/9e9c7581-db86-4c8a-9692-3fcf07b99c42-kube-api-access-p7zxn\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614213 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-metrics-certs\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614352 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f7dc39c4-5e34-4d07-909f-85761440a108-audit-dir\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.614404 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r995f\" (UniqueName: 
\"kubernetes.io/projected/b6df6df8-563e-4d8a-b9e5-29250531a399-kube-api-access-r995f\") pod \"cluster-samples-operator-665b6dd947-pcnsb\" (UID: \"b6df6df8-563e-4d8a-b9e5-29250531a399\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615626 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/280ee280-d01c-4e3e-8390-69e6eb19a579-srv-cert\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615684 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702984bc-83a3-4da1-bd02-f8879e78502d-secret-volume\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615761 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615812 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-config\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: 
I1204 17:27:18.615851 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c394cd7-6d7c-4880-911d-cc27cc380a17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615931 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615966 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-plugins-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.615998 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8tc\" (UniqueName: \"kubernetes.io/projected/96f5544e-1a2b-4d58-9d9c-799509953821-kube-api-access-fd8tc\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616119 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8c015fa-00f0-4670-990a-e830b7762674-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: 
\"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616164 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9126688-8fd4-46db-8188-dc8014777a8d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616230 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/484ee778-914d-4c53-aa0e-6383472e1ebd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616263 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e7e78009-45e5-40d0-a208-d0996554a35e-certs\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616316 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wps\" (UniqueName: \"kubernetes.io/projected/280ee280-d01c-4e3e-8390-69e6eb19a579-kube-api-access-j6wps\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616351 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xrf7\" (UniqueName: \"kubernetes.io/projected/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-kube-api-access-5xrf7\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616386 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-certificates\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616422 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616455 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-csi-data-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616529 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ef18d3-fb9b-46f0-82a0-4db3172f43a7-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-cjplw\" (UID: \"03ef18d3-fb9b-46f0-82a0-4db3172f43a7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616566 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720d0657-f05b-415e-a89b-cec265b15235-service-ca-bundle\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616600 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rlfn\" (UniqueName: \"kubernetes.io/projected/090a1667-3d12-491d-96d4-3efddf82b503-kube-api-access-9rlfn\") pod \"ingress-canary-8k5hb\" (UID: \"090a1667-3d12-491d-96d4-3efddf82b503\") " pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616635 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616667 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-serving-cert\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616701 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-service-ca-bundle\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616732 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-etcd-client\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616764 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-serving-cert\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616798 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5237623-6755-491c-8345-90f85db04335-auth-proxy-config\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616831 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97da035d-1f0b-4a53-bc43-04b3a495eda9-tmpfs\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc 
kubenswrapper[4948]: I1204 17:27:18.616862 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7kc\" (UniqueName: \"kubernetes.io/projected/532c984d-78f6-4e46-be62-53cb87748bcb-kube-api-access-7g7kc\") pod \"multus-admission-controller-857f4d67dd-fltqd\" (UID: \"532c984d-78f6-4e46-be62-53cb87748bcb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616901 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-bound-sa-token\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616933 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5779a86-4384-4d85-8235-be7dfedc7c68-proxy-tls\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.616988 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.617020 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e9c7581-db86-4c8a-9692-3fcf07b99c42-config\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.617115 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-trusted-ca\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.617486 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-encryption-config\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.617782 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.618573 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/484ee778-914d-4c53-aa0e-6383472e1ebd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.619765 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-certificates\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620545 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-serving-cert\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620613 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-config\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620628 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620755 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-config\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620808 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-mountpoint-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620888 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nk4n\" (UniqueName: \"kubernetes.io/projected/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-kube-api-access-7nk4n\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.620954 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b106bf37-a37c-45a8-be6a-296e7288eb80-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621007 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97da035d-1f0b-4a53-bc43-04b3a495eda9-apiservice-cert\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621131 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22n7b\" 
(UniqueName: \"kubernetes.io/projected/03ef18d3-fb9b-46f0-82a0-4db3172f43a7-kube-api-access-22n7b\") pod \"package-server-manager-789f6589d5-cjplw\" (UID: \"03ef18d3-fb9b-46f0-82a0-4db3172f43a7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621246 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/484ee778-914d-4c53-aa0e-6383472e1ebd-serving-cert\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621308 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39ae03b6-0da8-43f7-84d2-300f5d0648af-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621415 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8hq4\" (UniqueName: \"kubernetes.io/projected/c9126688-8fd4-46db-8188-dc8014777a8d-kube-api-access-g8hq4\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621480 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621536 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-audit-policies\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621610 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-client-ca\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621662 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c394cd7-6d7c-4880-911d-cc27cc380a17-srv-cert\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621710 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/218eaa87-4b22-4db2-8ff0-174995db7128-metrics-tls\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621788 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzsn\" (UniqueName: 
\"kubernetes.io/projected/97da035d-1f0b-4a53-bc43-04b3a495eda9-kube-api-access-trzsn\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621845 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621898 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dwn4\" (UniqueName: \"kubernetes.io/projected/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-kube-api-access-4dwn4\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.621954 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c5237623-6755-491c-8345-90f85db04335-machine-approver-tls\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622004 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622167 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfqq\" (UniqueName: \"kubernetes.io/projected/c5237623-6755-491c-8345-90f85db04335-kube-api-access-hjfqq\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622257 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622310 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcw4l\" (UniqueName: \"kubernetes.io/projected/c5779a86-4384-4d85-8235-be7dfedc7c68-kube-api-access-mcw4l\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622364 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rpg\" (UniqueName: \"kubernetes.io/projected/484ee778-914d-4c53-aa0e-6383472e1ebd-kube-api-access-m6rpg\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622407 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8c015fa-00f0-4670-990a-e830b7762674-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622418 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-config\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622500 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218eaa87-4b22-4db2-8ff0-174995db7128-config-volume\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622543 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptzz6\" (UniqueName: \"kubernetes.io/projected/218eaa87-4b22-4db2-8ff0-174995db7128-kube-api-access-ptzz6\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622579 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9c7581-db86-4c8a-9692-3fcf07b99c42-serving-cert\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622696 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b106bf37-a37c-45a8-be6a-296e7288eb80-config\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622734 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5779a86-4384-4d85-8235-be7dfedc7c68-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622768 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-proxy-tls\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622801 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6df6df8-563e-4d8a-b9e5-29250531a399-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pcnsb\" (UID: \"b6df6df8-563e-4d8a-b9e5-29250531a399\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622859 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-b529v\" (UniqueName: \"kubernetes.io/projected/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-kube-api-access-b529v\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622916 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8c015fa-00f0-4670-990a-e830b7762674-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.622951 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkmwj\" (UniqueName: \"kubernetes.io/projected/b7650dc5-9e1f-49e4-98f8-45836883f728-kube-api-access-tkmwj\") pod \"migrator-59844c95c7-j8pfs\" (UID: \"b7650dc5-9e1f-49e4-98f8-45836883f728\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.623004 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-socket-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.623141 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-serving-cert\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc 
kubenswrapper[4948]: I1204 17:27:18.623995 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.625084 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b106bf37-a37c-45a8-be6a-296e7288eb80-config\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.625473 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5779a86-4384-4d85-8235-be7dfedc7c68-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.626887 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8c015fa-00f0-4670-990a-e830b7762674-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.627744 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: 
\"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.628422 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-service-ca-bundle\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.629544 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ef18d3-fb9b-46f0-82a0-4db3172f43a7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cjplw\" (UID: \"03ef18d3-fb9b-46f0-82a0-4db3172f43a7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.629857 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-tls\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.631758 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.131738981 +0000 UTC m=+50.492813383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.632085 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-config\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.632197 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f7dc39c4-5e34-4d07-909f-85761440a108-audit-policies\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.632445 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-serving-cert\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.632649 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-trusted-ca\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.632835 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f7dc39c4-5e34-4d07-909f-85761440a108-etcd-client\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.633512 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-config\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.634182 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-client-ca\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.635945 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b106bf37-a37c-45a8-be6a-296e7288eb80-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.636106 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.636737 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c5779a86-4384-4d85-8235-be7dfedc7c68-proxy-tls\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.636877 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.642029 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/484ee778-914d-4c53-aa0e-6383472e1ebd-serving-cert\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.642777 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-serving-cert\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.671430 4948 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8c015fa-00f0-4670-990a-e830b7762674-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.699258 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr5zq\" (UniqueName: \"kubernetes.io/projected/d8c015fa-00f0-4670-990a-e830b7762674-kube-api-access-lr5zq\") pod \"cluster-image-registry-operator-dc59b4c8b-ltp95\" (UID: \"d8c015fa-00f0-4670-990a-e830b7762674\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.700015 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.717345 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75j7\" (UniqueName: \"kubernetes.io/projected/f7dc39c4-5e34-4d07-909f-85761440a108-kube-api-access-l75j7\") pod \"apiserver-7bbb656c7d-kk9wl\" (UID: \"f7dc39c4-5e34-4d07-909f-85761440a108\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.725567 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.725981 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726001 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e9c7581-db86-4c8a-9692-3fcf07b99c42-config\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726030 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-mountpoint-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726085 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97da035d-1f0b-4a53-bc43-04b3a495eda9-apiservice-cert\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726115 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39ae03b6-0da8-43f7-84d2-300f5d0648af-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726139 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g8hq4\" (UniqueName: \"kubernetes.io/projected/c9126688-8fd4-46db-8188-dc8014777a8d-kube-api-access-g8hq4\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726156 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c394cd7-6d7c-4880-911d-cc27cc380a17-srv-cert\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726170 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/218eaa87-4b22-4db2-8ff0-174995db7128-metrics-tls\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726196 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trzsn\" (UniqueName: \"kubernetes.io/projected/97da035d-1f0b-4a53-bc43-04b3a495eda9-kube-api-access-trzsn\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726216 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c5237623-6755-491c-8345-90f85db04335-machine-approver-tls\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726234 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726248 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfqq\" (UniqueName: \"kubernetes.io/projected/c5237623-6755-491c-8345-90f85db04335-kube-api-access-hjfqq\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726281 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218eaa87-4b22-4db2-8ff0-174995db7128-config-volume\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726299 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptzz6\" (UniqueName: \"kubernetes.io/projected/218eaa87-4b22-4db2-8ff0-174995db7128-kube-api-access-ptzz6\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726345 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9c7581-db86-4c8a-9692-3fcf07b99c42-serving-cert\") pod 
\"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726366 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-proxy-tls\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726383 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6df6df8-563e-4d8a-b9e5-29250531a399-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pcnsb\" (UID: \"b6df6df8-563e-4d8a-b9e5-29250531a399\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726399 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b529v\" (UniqueName: \"kubernetes.io/projected/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-kube-api-access-b529v\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726418 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkmwj\" (UniqueName: \"kubernetes.io/projected/b7650dc5-9e1f-49e4-98f8-45836883f728-kube-api-access-tkmwj\") pod \"migrator-59844c95c7-j8pfs\" (UID: \"b7650dc5-9e1f-49e4-98f8-45836883f728\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726435 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-socket-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726449 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9126688-8fd4-46db-8188-dc8014777a8d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726465 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702984bc-83a3-4da1-bd02-f8879e78502d-config-volume\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726481 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726505 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: 
\"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726518 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97da035d-1f0b-4a53-bc43-04b3a495eda9-webhook-cert\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726534 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e7e78009-45e5-40d0-a208-d0996554a35e-node-bootstrap-token\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726550 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090a1667-3d12-491d-96d4-3efddf82b503-cert\") pod \"ingress-canary-8k5hb\" (UID: \"090a1667-3d12-491d-96d4-3efddf82b503\") " pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726565 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-registration-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726589 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kd7t\" (UniqueName: \"kubernetes.io/projected/e7e78009-45e5-40d0-a208-d0996554a35e-kube-api-access-5kd7t\") pod 
\"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726612 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/532c984d-78f6-4e46-be62-53cb87748bcb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fltqd\" (UID: \"532c984d-78f6-4e46-be62-53cb87748bcb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726630 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhfw\" (UniqueName: \"kubernetes.io/projected/7c394cd7-6d7c-4880-911d-cc27cc380a17-kube-api-access-qhhfw\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726645 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39ae03b6-0da8-43f7-84d2-300f5d0648af-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726663 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/39ae03b6-0da8-43f7-84d2-300f5d0648af-ready\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726678 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjfg\" (UniqueName: 
\"kubernetes.io/projected/702984bc-83a3-4da1-bd02-f8879e78502d-kube-api-access-dtjfg\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726694 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4jp\" (UniqueName: \"kubernetes.io/projected/39ae03b6-0da8-43f7-84d2-300f5d0648af-kube-api-access-zj4jp\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726723 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5237623-6755-491c-8345-90f85db04335-config\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726738 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/280ee280-d01c-4e3e-8390-69e6eb19a579-profile-collector-cert\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726752 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc 
kubenswrapper[4948]: I1204 17:27:18.726767 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7zxn\" (UniqueName: \"kubernetes.io/projected/9e9c7581-db86-4c8a-9692-3fcf07b99c42-kube-api-access-p7zxn\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726789 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r995f\" (UniqueName: \"kubernetes.io/projected/b6df6df8-563e-4d8a-b9e5-29250531a399-kube-api-access-r995f\") pod \"cluster-samples-operator-665b6dd947-pcnsb\" (UID: \"b6df6df8-563e-4d8a-b9e5-29250531a399\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726814 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/280ee280-d01c-4e3e-8390-69e6eb19a579-srv-cert\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726833 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702984bc-83a3-4da1-bd02-f8879e78502d-secret-volume\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726857 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c394cd7-6d7c-4880-911d-cc27cc380a17-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726872 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-plugins-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726888 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8tc\" (UniqueName: \"kubernetes.io/projected/96f5544e-1a2b-4d58-9d9c-799509953821-kube-api-access-fd8tc\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726903 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9126688-8fd4-46db-8188-dc8014777a8d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726920 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e7e78009-45e5-40d0-a208-d0996554a35e-certs\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726937 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6wps\" (UniqueName: 
\"kubernetes.io/projected/280ee280-d01c-4e3e-8390-69e6eb19a579-kube-api-access-j6wps\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726952 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xrf7\" (UniqueName: \"kubernetes.io/projected/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-kube-api-access-5xrf7\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726967 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.726982 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-csi-data-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.727005 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rlfn\" (UniqueName: \"kubernetes.io/projected/090a1667-3d12-491d-96d4-3efddf82b503-kube-api-access-9rlfn\") pod \"ingress-canary-8k5hb\" (UID: \"090a1667-3d12-491d-96d4-3efddf82b503\") " pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.727022 
4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5237623-6755-491c-8345-90f85db04335-auth-proxy-config\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.727035 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97da035d-1f0b-4a53-bc43-04b3a495eda9-tmpfs\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.727081 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7kc\" (UniqueName: \"kubernetes.io/projected/532c984d-78f6-4e46-be62-53cb87748bcb-kube-api-access-7g7kc\") pod \"multus-admission-controller-857f4d67dd-fltqd\" (UID: \"532c984d-78f6-4e46-be62-53cb87748bcb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.727297 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.227277314 +0000 UTC m=+50.588351716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.727675 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-registration-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.728230 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e9c7581-db86-4c8a-9692-3fcf07b99c42-config\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.728275 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-mountpoint-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.731833 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97da035d-1f0b-4a53-bc43-04b3a495eda9-apiservice-cert\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.732925 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/280ee280-d01c-4e3e-8390-69e6eb19a579-srv-cert\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.734285 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5237623-6755-491c-8345-90f85db04335-config\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.734738 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39ae03b6-0da8-43f7-84d2-300f5d0648af-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.735946 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/218eaa87-4b22-4db2-8ff0-174995db7128-config-volume\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.736215 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-proxy-tls\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.736475 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-csi-data-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.736542 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.737409 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9126688-8fd4-46db-8188-dc8014777a8d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.737812 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39ae03b6-0da8-43f7-84d2-300f5d0648af-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.737967 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.738967 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-plugins-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.739445 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702984bc-83a3-4da1-bd02-f8879e78502d-config-volume\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.739527 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/532c984d-78f6-4e46-be62-53cb87748bcb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fltqd\" (UID: \"532c984d-78f6-4e46-be62-53cb87748bcb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.739954 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/280ee280-d01c-4e3e-8390-69e6eb19a579-profile-collector-cert\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.740030 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c5237623-6755-491c-8345-90f85db04335-auth-proxy-config\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.742736 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/96f5544e-1a2b-4d58-9d9c-799509953821-socket-dir\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.742950 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/39ae03b6-0da8-43f7-84d2-300f5d0648af-ready\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.742977 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97da035d-1f0b-4a53-bc43-04b3a495eda9-tmpfs\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.743546 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e7e78009-45e5-40d0-a208-d0996554a35e-certs\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.744411 4948 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090a1667-3d12-491d-96d4-3efddf82b503-cert\") pod \"ingress-canary-8k5hb\" (UID: \"090a1667-3d12-491d-96d4-3efddf82b503\") " pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.744803 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e7e78009-45e5-40d0-a208-d0996554a35e-node-bootstrap-token\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.745611 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/218eaa87-4b22-4db2-8ff0-174995db7128-metrics-tls\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.745864 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97da035d-1f0b-4a53-bc43-04b3a495eda9-webhook-cert\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.748664 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nk4n\" (UniqueName: \"kubernetes.io/projected/4d1c5f04-f0c8-4865-bdba-4347d9840bfb-kube-api-access-7nk4n\") pod \"authentication-operator-69f744f599-4qwlk\" (UID: \"4d1c5f04-f0c8-4865-bdba-4347d9840bfb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.749526 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.749898 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9126688-8fd4-46db-8188-dc8014777a8d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.753191 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702984bc-83a3-4da1-bd02-f8879e78502d-secret-volume\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.765595 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-595fv"] Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.767580 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b106bf37-a37c-45a8-be6a-296e7288eb80-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p457n\" (UID: \"b106bf37-a37c-45a8-be6a-296e7288eb80\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.779439 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:18 crc kubenswrapper[4948]: W1204 17:27:18.801932 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fbe65a5_28e7_40db_85f7_66d00806dcbe.slice/crio-2b7fc48bcab3276ea13fe7d8a3bc2c27cbc18de601e0a50a6e0b19df6bacedd1 WatchSource:0}: Error finding container 2b7fc48bcab3276ea13fe7d8a3bc2c27cbc18de601e0a50a6e0b19df6bacedd1: Status 404 returned error can't find the container with id 2b7fc48bcab3276ea13fe7d8a3bc2c27cbc18de601e0a50a6e0b19df6bacedd1 Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.820497 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d6slp"] Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.831913 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.832004 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6gb\" (UniqueName: \"kubernetes.io/projected/c3556602-2a66-48fb-a187-85849f5c08e4-kube-api-access-2g6gb\") pod \"downloads-7954f5f757-7jtdw\" (UID: \"c3556602-2a66-48fb-a187-85849f5c08e4\") " pod="openshift-console/downloads-7954f5f757-7jtdw" Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.832218 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:19.332201881 +0000 UTC m=+50.693276283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.834889 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-bound-sa-token\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.835530 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6gb\" (UniqueName: \"kubernetes.io/projected/c3556602-2a66-48fb-a187-85849f5c08e4-kube-api-access-2g6gb\") pod \"downloads-7954f5f757-7jtdw\" (UID: \"c3556602-2a66-48fb-a187-85849f5c08e4\") " pod="openshift-console/downloads-7954f5f757-7jtdw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.847350 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.847930 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692"] Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.848523 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dwn4\" (UniqueName: \"kubernetes.io/projected/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-kube-api-access-4dwn4\") pod \"controller-manager-879f6c89f-rz5j4\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.867578 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cmmdj\" (UID: \"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.872674 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7jtdw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.889334 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rpg\" (UniqueName: \"kubernetes.io/projected/484ee778-914d-4c53-aa0e-6383472e1ebd-kube-api-access-m6rpg\") pod \"openshift-config-operator-7777fb866f-hv9v6\" (UID: \"484ee778-914d-4c53-aa0e-6383472e1ebd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.907261 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcw4l\" (UniqueName: \"kubernetes.io/projected/c5779a86-4384-4d85-8235-be7dfedc7c68-kube-api-access-mcw4l\") pod \"machine-config-operator-74547568cd-mmxqn\" (UID: \"c5779a86-4384-4d85-8235-be7dfedc7c68\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.924946 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c5237623-6755-491c-8345-90f85db04335-machine-approver-tls\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.925166 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j4pg\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-kube-api-access-2j4pg\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.925977 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/7c394cd7-6d7c-4880-911d-cc27cc380a17-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.926524 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c394cd7-6d7c-4880-911d-cc27cc380a17-srv-cert\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.927454 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e9c7581-db86-4c8a-9692-3fcf07b99c42-serving-cert\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.927669 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720d0657-f05b-415e-a89b-cec265b15235-service-ca-bundle\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.928842 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b6df6df8-563e-4d8a-b9e5-29250531a399-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pcnsb\" (UID: \"b6df6df8-563e-4d8a-b9e5-29250531a399\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.928931 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-default-certificate\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.930552 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfsw4\" (UniqueName: \"kubernetes.io/projected/720d0657-f05b-415e-a89b-cec265b15235-kube-api-access-kfsw4\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.931806 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7kc\" (UniqueName: \"kubernetes.io/projected/532c984d-78f6-4e46-be62-53cb87748bcb-kube-api-access-7g7kc\") pod \"multus-admission-controller-857f4d67dd-fltqd\" (UID: \"532c984d-78f6-4e46-be62-53cb87748bcb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.932997 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.935196 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-metrics-certs\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc 
kubenswrapper[4948]: I1204 17:27:18.935703 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/720d0657-f05b-415e-a89b-cec265b15235-stats-auth\") pod \"router-default-5444994796-zgswc\" (UID: \"720d0657-f05b-415e-a89b-cec265b15235\") " pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.936349 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.936522 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.436503801 +0000 UTC m=+50.797578203 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.936734 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.941399 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:18 crc kubenswrapper[4948]: E1204 17:27:18.941901 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.4418858 +0000 UTC m=+50.802960202 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.942194 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22n7b\" (UniqueName: \"kubernetes.io/projected/03ef18d3-fb9b-46f0-82a0-4db3172f43a7-kube-api-access-22n7b\") pod \"package-server-manager-789f6589d5-cjplw\" (UID: \"03ef18d3-fb9b-46f0-82a0-4db3172f43a7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.943923 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.950073 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.956598 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjfg\" (UniqueName: \"kubernetes.io/projected/702984bc-83a3-4da1-bd02-f8879e78502d-kube-api-access-dtjfg\") pod \"collect-profiles-29414475-l8jtf\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.979345 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kd7t\" (UniqueName: \"kubernetes.io/projected/e7e78009-45e5-40d0-a208-d0996554a35e-kube-api-access-5kd7t\") pod \"machine-config-server-xxt7h\" (UID: \"e7e78009-45e5-40d0-a208-d0996554a35e\") " pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:18 crc kubenswrapper[4948]: I1204 17:27:18.998285 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4jp\" (UniqueName: \"kubernetes.io/projected/39ae03b6-0da8-43f7-84d2-300f5d0648af-kube-api-access-zj4jp\") pod \"cni-sysctl-allowlist-ds-h58bm\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.010375 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wps\" (UniqueName: \"kubernetes.io/projected/280ee280-d01c-4e3e-8390-69e6eb19a579-kube-api-access-j6wps\") pod \"catalog-operator-68c6474976-kd2hw\" (UID: \"280ee280-d01c-4e3e-8390-69e6eb19a579\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.033370 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhfw\" (UniqueName: 
\"kubernetes.io/projected/7c394cd7-6d7c-4880-911d-cc27cc380a17-kube-api-access-qhhfw\") pod \"olm-operator-6b444d44fb-kdgm4\" (UID: \"7c394cd7-6d7c-4880-911d-cc27cc380a17\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.037562 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.037671 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.53764969 +0000 UTC m=+50.898724102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.037877 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.038130 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.538123272 +0000 UTC m=+50.899197674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.039669 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.039901 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.060337 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xxt7h" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.069780 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8tc\" (UniqueName: \"kubernetes.io/projected/96f5544e-1a2b-4d58-9d9c-799509953821-kube-api-access-fd8tc\") pod \"csi-hostpathplugin-tsgwn\" (UID: \"96f5544e-1a2b-4d58-9d9c-799509953821\") " pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.078282 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.079604 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.085441 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7zxn\" (UniqueName: \"kubernetes.io/projected/9e9c7581-db86-4c8a-9692-3fcf07b99c42-kube-api-access-p7zxn\") pod \"service-ca-operator-777779d784-vnv9b\" (UID: \"9e9c7581-db86-4c8a-9692-3fcf07b99c42\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.087406 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.098608 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xgll2\" (UID: \"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.108421 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.111872 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptzz6\" (UniqueName: \"kubernetes.io/projected/218eaa87-4b22-4db2-8ff0-174995db7128-kube-api-access-ptzz6\") pod \"dns-default-tqbdj\" (UID: \"218eaa87-4b22-4db2-8ff0-174995db7128\") " pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.116865 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.137680 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xrf7\" (UniqueName: \"kubernetes.io/projected/9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72-kube-api-access-5xrf7\") pod \"machine-config-controller-84d6567774-d5zf7\" (UID: \"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.137952 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.138542 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.138761 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.638743747 +0000 UTC m=+50.999818149 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.138929 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.139233 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.63922544 +0000 UTC m=+51.000299842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.161357 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfqq\" (UniqueName: \"kubernetes.io/projected/c5237623-6755-491c-8345-90f85db04335-kube-api-access-hjfqq\") pod \"machine-approver-56656f9798-wftpx\" (UID: \"c5237623-6755-491c-8345-90f85db04335\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.204701 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.204882 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r995f\" (UniqueName: \"kubernetes.io/projected/b6df6df8-563e-4d8a-b9e5-29250531a399-kube-api-access-r995f\") pod \"cluster-samples-operator-665b6dd947-pcnsb\" (UID: \"b6df6df8-563e-4d8a-b9e5-29250531a399\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.205455 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8hq4\" (UniqueName: \"kubernetes.io/projected/c9126688-8fd4-46db-8188-dc8014777a8d-kube-api-access-g8hq4\") pod \"kube-storage-version-migrator-operator-b67b599dd-m59pp\" (UID: \"c9126688-8fd4-46db-8188-dc8014777a8d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.205902 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.213589 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.242823 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.242999 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.243349 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.743332015 +0000 UTC m=+51.104406417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.250607 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkmwj\" (UniqueName: \"kubernetes.io/projected/b7650dc5-9e1f-49e4-98f8-45836883f728-kube-api-access-tkmwj\") pod \"migrator-59844c95c7-j8pfs\" (UID: \"b7650dc5-9e1f-49e4-98f8-45836883f728\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.251263 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzsn\" (UniqueName: 
\"kubernetes.io/projected/97da035d-1f0b-4a53-bc43-04b3a495eda9-kube-api-access-trzsn\") pod \"packageserver-d55dfcdfc-ggrb4\" (UID: \"97da035d-1f0b-4a53-bc43-04b3a495eda9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.256688 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rlfn\" (UniqueName: \"kubernetes.io/projected/090a1667-3d12-491d-96d4-3efddf82b503-kube-api-access-9rlfn\") pod \"ingress-canary-8k5hb\" (UID: \"090a1667-3d12-491d-96d4-3efddf82b503\") " pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.259460 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.269475 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b529v\" (UniqueName: \"kubernetes.io/projected/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-kube-api-access-b529v\") pod \"marketplace-operator-79b997595-hbqk5\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.271742 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.286989 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.293995 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.303643 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.314948 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.329894 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.339859 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-m5k2z"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.344208 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.346534 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.846517716 +0000 UTC m=+51.207592118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.352411 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8k5hb" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.370351 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.378273 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5275t"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.444845 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.445033 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.945006656 +0000 UTC m=+51.306081058 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.445117 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.445433 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:19.945418197 +0000 UTC m=+51.306492689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.487125 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fltqd"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.488029 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.492471 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2tl2h"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.500566 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" event={"ID":"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2","Type":"ContainerStarted","Data":"33d2ea4762dbd5abe585ddd0e7891f45600b076ed696d4825c5a7907bc56abd0"} Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.501723 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" event={"ID":"de05452b-7cdf-44da-a351-b21ba3691f41","Type":"ContainerStarted","Data":"ed31beb15504e7d1241f262ac6408508446ea617ca3a6d7b65107c330b39161e"} Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.502603 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" 
event={"ID":"4fbe65a5-28e7-40db-85f7-66d00806dcbe","Type":"ContainerStarted","Data":"2b7fc48bcab3276ea13fe7d8a3bc2c27cbc18de601e0a50a6e0b19df6bacedd1"} Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.504591 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" event={"ID":"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845","Type":"ContainerStarted","Data":"4b57c1976c76574076b494813d2142804359f7cc72a11c368cafee399ea45f9d"} Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.507101 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.524659 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.531623 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.546490 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.546685 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.046657208 +0000 UTC m=+51.407731610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.546881 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.547209 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.047202632 +0000 UTC m=+51.408277034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.552931 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-f7tp6"] Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.567263 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80f2233_6a99_49c2_a8fc_1bb335b2dd79.slice/crio-08362ca2cf4b27503604d585ec31d395f8785554434340773d17fc0ad06d0a8f WatchSource:0}: Error finding container 08362ca2cf4b27503604d585ec31d395f8785554434340773d17fc0ad06d0a8f: Status 404 returned error can't find the container with id 08362ca2cf4b27503604d585ec31d395f8785554434340773d17fc0ad06d0a8f Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.572805 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod532c984d_78f6_4e46_be62_53cb87748bcb.slice/crio-4972cd737cd3997295894a0de364c69a35fc81787a20ca180335aa9f83d61dfd WatchSource:0}: Error finding container 4972cd737cd3997295894a0de364c69a35fc81787a20ca180335aa9f83d61dfd: Status 404 returned error can't find the container with id 4972cd737cd3997295894a0de364c69a35fc81787a20ca180335aa9f83d61dfd Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.573799 4948 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5779a86_4384_4d85_8235_be7dfedc7c68.slice/crio-5e9886394e141893b3037ece8eb0acc2ec437bfa2c951b8943e106066d5873e5 WatchSource:0}: Error finding container 5e9886394e141893b3037ece8eb0acc2ec437bfa2c951b8943e106066d5873e5: Status 404 returned error can't find the container with id 5e9886394e141893b3037ece8eb0acc2ec437bfa2c951b8943e106066d5873e5 Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.584413 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7dc39c4_5e34_4d07_909f_85761440a108.slice/crio-ce03010963e67c9fc53b28929ba4307676772ae7231602ab7879e9f84414e655 WatchSource:0}: Error finding container ce03010963e67c9fc53b28929ba4307676772ae7231602ab7879e9f84414e655: Status 404 returned error can't find the container with id ce03010963e67c9fc53b28929ba4307676772ae7231602ab7879e9f84414e655 Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.602224 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n"] Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.614848 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09680d2b_7d6e_4dcd_bf38_d4642fe27ac2.slice/crio-71972b1fe58a4c97f7b07fbb27725aa4937916f479ff192587f753e60c67f5c0 WatchSource:0}: Error finding container 71972b1fe58a4c97f7b07fbb27725aa4937916f479ff192587f753e60c67f5c0: Status 404 returned error can't find the container with id 71972b1fe58a4c97f7b07fbb27725aa4937916f479ff192587f753e60c67f5c0 Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.622075 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rz5j4"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.636198 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.651057 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.651528 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.151496532 +0000 UTC m=+51.512570934 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.651587 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb106bf37_a37c_45a8_be6a_296e7288eb80.slice/crio-5179a72272ee9fb534708e24e6d6054c1bb8af6797d3fb87c365ab08d4f492ac WatchSource:0}: Error finding container 5179a72272ee9fb534708e24e6d6054c1bb8af6797d3fb87c365ab08d4f492ac: Status 404 returned error can't find the container with id 5179a72272ee9fb534708e24e6d6054c1bb8af6797d3fb87c365ab08d4f492ac Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.652948 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.653254 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.153247357 +0000 UTC m=+51.514321759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.677946 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7jtdw"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.753798 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.754471 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.254453768 +0000 UTC m=+51.615528170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: W1204 17:27:19.772499 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c3b0a5_59be_438e_a074_b3f5b154039e.slice/crio-6eb83cb13681af13e1247811480d81b7fd7790cab781ebfce9acfea979b615a8 WatchSource:0}: Error finding container 6eb83cb13681af13e1247811480d81b7fd7790cab781ebfce9acfea979b615a8: Status 404 returned error can't find the container with id 6eb83cb13681af13e1247811480d81b7fd7790cab781ebfce9acfea979b615a8 Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.807290 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.841398 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.855630 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.855973 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.355960676 +0000 UTC m=+51.717035078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.929638 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tsgwn"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.946577 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw"] Dec 04 17:27:19 crc kubenswrapper[4948]: I1204 17:27:19.956748 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:19 crc kubenswrapper[4948]: E1204 17:27:19.957160 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:20.457143045 +0000 UTC m=+51.818217447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.058530 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.058829 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.558815958 +0000 UTC m=+51.919890350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.086740 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4qwlk"] Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.159626 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.160079 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.660031038 +0000 UTC m=+52.021105450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.261526 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.261884 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.761866084 +0000 UTC m=+52.122940496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.363126 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.363326 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.86329312 +0000 UTC m=+52.224367562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.364786 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.365318 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.865298492 +0000 UTC m=+52.226372924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.471466 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.473575 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:20.973508914 +0000 UTC m=+52.334583356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.510423 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" event={"ID":"39ae03b6-0da8-43f7-84d2-300f5d0648af","Type":"ContainerStarted","Data":"1de308fd291145d205ca4cb5764822e4cf9e34c6ceef969ebcec255d629ef42a"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.512384 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xxt7h" event={"ID":"e7e78009-45e5-40d0-a208-d0996554a35e","Type":"ContainerStarted","Data":"c36ff026658a51a31eb2791f131e9b12f342d24b2f0c33e19971453ed2152a74"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.514308 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" event={"ID":"c5779a86-4384-4d85-8235-be7dfedc7c68","Type":"ContainerStarted","Data":"5e9886394e141893b3037ece8eb0acc2ec437bfa2c951b8943e106066d5873e5"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.516192 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zgswc" event={"ID":"720d0657-f05b-415e-a89b-cec265b15235","Type":"ContainerStarted","Data":"fbb8be8906491254f1ed8d188b5c84d079f671055f372ce3101283d59fc64c45"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.517892 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" 
event={"ID":"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2","Type":"ContainerStarted","Data":"71972b1fe58a4c97f7b07fbb27725aa4937916f479ff192587f753e60c67f5c0"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.519459 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" event={"ID":"cad76813-b0e7-4c9c-86e9-44d797f5dbb9","Type":"ContainerStarted","Data":"4f13afb2157b129fcb89d1d5bd75562a15dab8d4e97b76ccdf436ea868b3c675"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.521771 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m5k2z" event={"ID":"f80f2233-6a99-49c2-a8fc-1bb335b2dd79","Type":"ContainerStarted","Data":"08362ca2cf4b27503604d585ec31d395f8785554434340773d17fc0ad06d0a8f"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.523914 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" event={"ID":"b106bf37-a37c-45a8-be6a-296e7288eb80","Type":"ContainerStarted","Data":"5179a72272ee9fb534708e24e6d6054c1bb8af6797d3fb87c365ab08d4f492ac"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.527291 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" event={"ID":"f7dc39c4-5e34-4d07-909f-85761440a108","Type":"ContainerStarted","Data":"ce03010963e67c9fc53b28929ba4307676772ae7231602ab7879e9f84414e655"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.529824 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" event={"ID":"62821e25-9412-4650-a9e0-34f4fe49656b","Type":"ContainerStarted","Data":"21c21358c2662a628a6e9b2dd29795e1ba9a49d2e7bac9db4ebc7c6215473b7f"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.531178 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" event={"ID":"c0514d31-211c-4b78-b2a3-8536fe75604d","Type":"ContainerStarted","Data":"6ceee946f824ec71d38f67438cd0bffb0b03bc690a1af2dad5c527d1ab3fe9c8"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.532606 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" event={"ID":"532c984d-78f6-4e46-be62-53cb87748bcb","Type":"ContainerStarted","Data":"4972cd737cd3997295894a0de364c69a35fc81787a20ca180335aa9f83d61dfd"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.534459 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" event={"ID":"87c3b0a5-59be-438e-a074-b3f5b154039e","Type":"ContainerStarted","Data":"6eb83cb13681af13e1247811480d81b7fd7790cab781ebfce9acfea979b615a8"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.536032 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" event={"ID":"d8c015fa-00f0-4670-990a-e830b7762674","Type":"ContainerStarted","Data":"b57a50882b116f65cc7e87d4463b81efd6faf67e769a4f3b3454b15e0cbd3d5a"} Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.575692 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.576030 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:21.076013728 +0000 UTC m=+52.437088140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.676972 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.677334 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.177223257 +0000 UTC m=+52.538297719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.677467 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.677896 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.177874914 +0000 UTC m=+52.538949356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.778873 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.779092 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.279021072 +0000 UTC m=+52.640095514 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.779616 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.779997 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.279974297 +0000 UTC m=+52.641048809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.882145 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.882372 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.382340947 +0000 UTC m=+52.743415359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.882716 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.883242 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.38320744 +0000 UTC m=+52.744281942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: W1204 17:27:20.950457 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod484ee778_914d_4c53_aa0e_6383472e1ebd.slice/crio-737fb552aa329c4af582c367a271c6d2e8e3dd9d17f6821312cd8b8a5f7ba116 WatchSource:0}: Error finding container 737fb552aa329c4af582c367a271c6d2e8e3dd9d17f6821312cd8b8a5f7ba116: Status 404 returned error can't find the container with id 737fb552aa329c4af582c367a271c6d2e8e3dd9d17f6821312cd8b8a5f7ba116 Dec 04 17:27:20 crc kubenswrapper[4948]: W1204 17:27:20.951011 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod702984bc_83a3_4da1_bd02_f8879e78502d.slice/crio-a81bba86c2fb25a63be051d86d5be788ba0c4d9461ec541645c0a2d6da266c18 WatchSource:0}: Error finding container a81bba86c2fb25a63be051d86d5be788ba0c4d9461ec541645c0a2d6da266c18: Status 404 returned error can't find the container with id a81bba86c2fb25a63be051d86d5be788ba0c4d9461ec541645c0a2d6da266c18 Dec 04 17:27:20 crc kubenswrapper[4948]: W1204 17:27:20.955445 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280ee280_d01c_4e3e_8390_69e6eb19a579.slice/crio-c500f34b0f45c3fd5c866030ddafc54df211ad1b4c4e1a75f22b360d6ae2377d WatchSource:0}: Error finding container c500f34b0f45c3fd5c866030ddafc54df211ad1b4c4e1a75f22b360d6ae2377d: Status 404 returned error can't find the container 
with id c500f34b0f45c3fd5c866030ddafc54df211ad1b4c4e1a75f22b360d6ae2377d Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.956831 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw"] Dec 04 17:27:20 crc kubenswrapper[4948]: W1204 17:27:20.962823 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f5544e_1a2b_4d58_9d9c_799509953821.slice/crio-5d71817fc05bdd7cf45dc9146557c6dafbf5814fcf4771e3f28a00f5be69e5ee WatchSource:0}: Error finding container 5d71817fc05bdd7cf45dc9146557c6dafbf5814fcf4771e3f28a00f5be69e5ee: Status 404 returned error can't find the container with id 5d71817fc05bdd7cf45dc9146557c6dafbf5814fcf4771e3f28a00f5be69e5ee Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.966795 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4"] Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.968867 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj"] Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.983595 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.983764 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:21.483736982 +0000 UTC m=+52.844811384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:20 crc kubenswrapper[4948]: I1204 17:27:20.983896 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:20 crc kubenswrapper[4948]: E1204 17:27:20.984266 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.484251906 +0000 UTC m=+52.845326318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: W1204 17:27:21.025487 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ef18d3_fb9b_46f0_82a0_4db3172f43a7.slice/crio-0f7d837988f406dbf5200f2498343a2af24441cb044ff1d68d5910aadc67e968 WatchSource:0}: Error finding container 0f7d837988f406dbf5200f2498343a2af24441cb044ff1d68d5910aadc67e968: Status 404 returned error can't find the container with id 0f7d837988f406dbf5200f2498343a2af24441cb044ff1d68d5910aadc67e968 Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.086684 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.086842 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.586815391 +0000 UTC m=+52.947889803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.087068 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.087416 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.587404226 +0000 UTC m=+52.948478628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.187614 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.187788 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.687750144 +0000 UTC m=+53.048824546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.187917 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.188673 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.688658498 +0000 UTC m=+53.049732900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.288876 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.289206 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.78918924 +0000 UTC m=+53.150263632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.289460 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.289756 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.789749265 +0000 UTC m=+53.150823667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.390557 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.390971 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.890952355 +0000 UTC m=+53.252026767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.493009 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.494845 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:21.994815704 +0000 UTC m=+53.355890106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.505765 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.507318 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.509640 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8k5hb"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.511503 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.541150 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.541185 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" event={"ID":"09e20532-f709-4854-82c2-7b84e2d62950","Type":"ContainerStarted","Data":"09bd41be8ab7bb850befa2c061a6b045e7f2a5269419525266804151957c3534"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.544156 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" 
event={"ID":"484ee778-914d-4c53-aa0e-6383472e1ebd","Type":"ContainerStarted","Data":"737fb552aa329c4af582c367a271c6d2e8e3dd9d17f6821312cd8b8a5f7ba116"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.546079 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" event={"ID":"280ee280-d01c-4e3e-8390-69e6eb19a579","Type":"ContainerStarted","Data":"c500f34b0f45c3fd5c866030ddafc54df211ad1b4c4e1a75f22b360d6ae2377d"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.546781 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" event={"ID":"7c394cd7-6d7c-4880-911d-cc27cc380a17","Type":"ContainerStarted","Data":"902b9b1a00c6e4fbb231e88e8f97afac589ec28306814d6a2ae1231e310a5f6c"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.547454 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" event={"ID":"4d1c5f04-f0c8-4865-bdba-4347d9840bfb","Type":"ContainerStarted","Data":"7970b881c8c3ee8cfa4bc731b60b69f7c1cce144e4dd4175af5c3ed83992b0dc"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.548368 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" event={"ID":"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec","Type":"ContainerStarted","Data":"27685e90a3c4eb94f2c00f18dcc59cd35b1a489674de7336487bf7d966c66c51"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.548943 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" event={"ID":"702984bc-83a3-4da1-bd02-f8879e78502d","Type":"ContainerStarted","Data":"a81bba86c2fb25a63be051d86d5be788ba0c4d9461ec541645c0a2d6da266c18"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.549482 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" event={"ID":"96f5544e-1a2b-4d58-9d9c-799509953821","Type":"ContainerStarted","Data":"5d71817fc05bdd7cf45dc9146557c6dafbf5814fcf4771e3f28a00f5be69e5ee"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.549968 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7jtdw" event={"ID":"c3556602-2a66-48fb-a187-85849f5c08e4","Type":"ContainerStarted","Data":"d9afec1df0c850a36a86f2b9447b6116a4cee395268a371c05a1148cbfca5186"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.550497 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" event={"ID":"c5237623-6755-491c-8345-90f85db04335","Type":"ContainerStarted","Data":"290b9bcf7646f5fe5e8877d96eefa3d283a4e4645117da41d0fc16669521408e"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.551209 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" event={"ID":"03ef18d3-fb9b-46f0-82a0-4db3172f43a7","Type":"ContainerStarted","Data":"0f7d837988f406dbf5200f2498343a2af24441cb044ff1d68d5910aadc67e968"} Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.594088 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.594589 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:22.094562157 +0000 UTC m=+53.455636569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.695617 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.696177 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.196148527 +0000 UTC m=+53.557222969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.761845 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.767017 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.772867 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.777694 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tqbdj"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.779887 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hbqk5"] Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.798357 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.798519 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.298497626 +0000 UTC m=+53.659572028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.799734 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.800893 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.300812216 +0000 UTC m=+53.661886658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:21 crc kubenswrapper[4948]: I1204 17:27:21.901130 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:21 crc kubenswrapper[4948]: E1204 17:27:21.901436 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.401421781 +0000 UTC m=+53.762496183 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.002035 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.002144 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.002548 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.003418 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.503398591 +0000 UTC m=+53.864473013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.019368 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.025154 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:22 crc kubenswrapper[4948]: W1204 17:27:22.030166 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97da035d_1f0b_4a53_bc43_04b3a495eda9.slice/crio-da625ea99c1b191014a2f1d00508d804bf5ff7c2bd656ad12906244496b98b74 WatchSource:0}: Error finding container da625ea99c1b191014a2f1d00508d804bf5ff7c2bd656ad12906244496b98b74: Status 404 
returned error can't find the container with id da625ea99c1b191014a2f1d00508d804bf5ff7c2bd656ad12906244496b98b74 Dec 04 17:27:22 crc kubenswrapper[4948]: W1204 17:27:22.075439 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e9c7581_db86_4c8a_9692_3fcf07b99c42.slice/crio-060fc3226c80789f202e88e9bdecc34c711bf7a1ff70aaf01ca9716dd04db4cf WatchSource:0}: Error finding container 060fc3226c80789f202e88e9bdecc34c711bf7a1ff70aaf01ca9716dd04db4cf: Status 404 returned error can't find the container with id 060fc3226c80789f202e88e9bdecc34c711bf7a1ff70aaf01ca9716dd04db4cf Dec 04 17:27:22 crc kubenswrapper[4948]: W1204 17:27:22.081111 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e200fe3_fcc4_4b69_9937_6a5ea6233cdf.slice/crio-3328c8b8d41fdb661d5f2ad81e8372fe5908587cbbbb1709ca2e625ec692483d WatchSource:0}: Error finding container 3328c8b8d41fdb661d5f2ad81e8372fe5908587cbbbb1709ca2e625ec692483d: Status 404 returned error can't find the container with id 3328c8b8d41fdb661d5f2ad81e8372fe5908587cbbbb1709ca2e625ec692483d Dec 04 17:27:22 crc kubenswrapper[4948]: W1204 17:27:22.083905 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7650dc5_9e1f_49e4_98f8_45836883f728.slice/crio-95b9b7bbee4bf9989d84bb6223db5d56596098c78a7527c06e92aad5bd6bf725 WatchSource:0}: Error finding container 95b9b7bbee4bf9989d84bb6223db5d56596098c78a7527c06e92aad5bd6bf725: Status 404 returned error can't find the container with id 95b9b7bbee4bf9989d84bb6223db5d56596098c78a7527c06e92aad5bd6bf725 Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.103649 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.103895 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.603874612 +0000 UTC m=+53.964949024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.103925 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.103996 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.104033 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.105158 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.605145235 +0000 UTC m=+53.966219637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.107498 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.109820 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.165259 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.176558 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.193907 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.205099 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.205441 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.705426662 +0000 UTC m=+54.066501064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.307061 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.307401 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.807386831 +0000 UTC m=+54.168461233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.408268 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.408656 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.908632282 +0000 UTC m=+54.269706684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.408843 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.409170 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:22.909155636 +0000 UTC m=+54.270230048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.513193 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.513376 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.013359864 +0000 UTC m=+54.374434256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.513477 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.513861 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.013851847 +0000 UTC m=+54.374926249 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.559030 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" event={"ID":"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1","Type":"ContainerStarted","Data":"3873f830cae5444cc9ae0c92e223006ffee5bd177746ee7b770e593f378bc141"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.562019 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" event={"ID":"c9126688-8fd4-46db-8188-dc8014777a8d","Type":"ContainerStarted","Data":"f8737b295e3284c3a63a64597a74490c9b2e185f58b86b779b4e4f8e541e156e"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.562915 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" event={"ID":"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf","Type":"ContainerStarted","Data":"3328c8b8d41fdb661d5f2ad81e8372fe5908587cbbbb1709ca2e625ec692483d"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.563748 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" event={"ID":"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72","Type":"ContainerStarted","Data":"6268dbb92ba47746434fb8b99e38af4f7440a21e016d95183fec9b12dbe4647a"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.564869 4948 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" event={"ID":"771c1e0f-69a0-4bf2-8345-37ed755de8ff","Type":"ContainerStarted","Data":"92fba8e6e64c770e364960949f60a9af9e32f6354376c16ae71909d7f0aa34f3"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.565641 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" event={"ID":"97da035d-1f0b-4a53-bc43-04b3a495eda9","Type":"ContainerStarted","Data":"da625ea99c1b191014a2f1d00508d804bf5ff7c2bd656ad12906244496b98b74"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.566527 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tqbdj" event={"ID":"218eaa87-4b22-4db2-8ff0-174995db7128","Type":"ContainerStarted","Data":"e49b661165b2ce92c3c167db5ab2d4a928445a39781da5ef00583003b01f3bfd"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.567420 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" event={"ID":"b7650dc5-9e1f-49e4-98f8-45836883f728","Type":"ContainerStarted","Data":"95b9b7bbee4bf9989d84bb6223db5d56596098c78a7527c06e92aad5bd6bf725"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.568363 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xfvmf" event={"ID":"6f1b9652-58ed-4708-8cae-58cf5b66d439","Type":"ContainerStarted","Data":"10c28c1ac7eb2057e93d56ec5959c06288a868e3d7fa650f5912d73c53680e87"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.569018 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8k5hb" event={"ID":"090a1667-3d12-491d-96d4-3efddf82b503","Type":"ContainerStarted","Data":"d59f813cfd1e5cd5875ca590de8f196065069bdba315028241f7358e5f5e0a49"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.569908 4948 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" event={"ID":"9e9c7581-db86-4c8a-9692-3fcf07b99c42","Type":"ContainerStarted","Data":"060fc3226c80789f202e88e9bdecc34c711bf7a1ff70aaf01ca9716dd04db4cf"} Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.614392 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.614679 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.114662216 +0000 UTC m=+54.475736618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.716163 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.716575 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.216559375 +0000 UTC m=+54.577633777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: W1204 17:27:22.793080 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-251fe3c36d4a99fb646d14fd9544602f085caf379ee4bcff462bbd9ae72ce15a WatchSource:0}: Error finding container 251fe3c36d4a99fb646d14fd9544602f085caf379ee4bcff462bbd9ae72ce15a: Status 404 returned error can't find the container with id 251fe3c36d4a99fb646d14fd9544602f085caf379ee4bcff462bbd9ae72ce15a Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.817826 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.818075 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.318031672 +0000 UTC m=+54.679106094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.818531 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.818927 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.318912224 +0000 UTC m=+54.679986636 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:22 crc kubenswrapper[4948]: W1204 17:27:22.910681 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f2225051c442a4790759f07710293bdee7b22da8bca3c78f76b78fa625480176 WatchSource:0}: Error finding container f2225051c442a4790759f07710293bdee7b22da8bca3c78f76b78fa625480176: Status 404 returned error can't find the container with id f2225051c442a4790759f07710293bdee7b22da8bca3c78f76b78fa625480176 Dec 04 17:27:22 crc kubenswrapper[4948]: I1204 17:27:22.919074 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:22 crc kubenswrapper[4948]: E1204 17:27:22.919379 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.419365005 +0000 UTC m=+54.780439407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.020908 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.021585 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.521575101 +0000 UTC m=+54.882649503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.122912 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.123210 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.623182042 +0000 UTC m=+54.984256434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.224035 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.224486 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.724471314 +0000 UTC m=+55.085545736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.325420 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.325664 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.825629763 +0000 UTC m=+55.186704205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.325919 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.326190 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.826179957 +0000 UTC m=+55.187254359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.427433 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.427878 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.92785435 +0000 UTC m=+55.288928772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.428400 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.428804 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:23.928784444 +0000 UTC m=+55.289858856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.529932 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.530159 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.030135198 +0000 UTC m=+55.391209600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.530396 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.531070 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.031024661 +0000 UTC m=+55.392099083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.575776 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"251fe3c36d4a99fb646d14fd9544602f085caf379ee4bcff462bbd9ae72ce15a"} Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.576887 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e84383b2e5c9a896a492404befe6caca5b09e4ca2dd85812d7c3b7e7758a254"} Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.577734 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f2225051c442a4790759f07710293bdee7b22da8bca3c78f76b78fa625480176"} Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.579468 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-trshb" event={"ID":"357c70a4-c799-43ba-8d28-ca99269d41fc","Type":"ContainerDied","Data":"35f44f663dc74e33a83a3f093cb0324899a62f01adf3ab02a566c50fb1697ec2"} Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.579415 4948 generic.go:334] "Generic (PLEG): container finished" podID="357c70a4-c799-43ba-8d28-ca99269d41fc" 
containerID="35f44f663dc74e33a83a3f093cb0324899a62f01adf3ab02a566c50fb1697ec2" exitCode=0 Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.634236 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.634471 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.134439238 +0000 UTC m=+55.495513680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.634548 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.634886 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.134867869 +0000 UTC m=+55.495942281 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.736531 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.736705 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.236682605 +0000 UTC m=+55.597757007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.736843 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.737245 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.23723489 +0000 UTC m=+55.598309292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.838431 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.838622 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.338594984 +0000 UTC m=+55.699669386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.838721 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.839154 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.339144418 +0000 UTC m=+55.700218890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:23 crc kubenswrapper[4948]: I1204 17:27:23.939699 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:23 crc kubenswrapper[4948]: E1204 17:27:23.940139 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.440120372 +0000 UTC m=+55.801194774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.041826 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.042246 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.542232646 +0000 UTC m=+55.903307048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.142501 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.142653 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.642630165 +0000 UTC m=+56.003704577 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.142807 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.143127 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.643116618 +0000 UTC m=+56.004191020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.212734 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.227211 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.243948 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.244099 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.74407643 +0000 UTC m=+56.105150842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.244238 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.244554 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.744545193 +0000 UTC m=+56.105619595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.345084 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.345198 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.845176178 +0000 UTC m=+56.206250580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.345419 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.345747 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.845736902 +0000 UTC m=+56.206811304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.446422 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.446566 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.946547442 +0000 UTC m=+56.307621854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.446643 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.446906 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:24.946893451 +0000 UTC m=+56.307967853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.548081 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.548218 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.048201894 +0000 UTC m=+56.409276296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.548410 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.548628 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.048621325 +0000 UTC m=+56.409695727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.588465 4948 generic.go:334] "Generic (PLEG): container finished" podID="f7dc39c4-5e34-4d07-909f-85761440a108" containerID="be75631bc8d42c48565f4953c9ccb049089c5d0adb084d80cf93409a73c673f2" exitCode=0 Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.588527 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" event={"ID":"f7dc39c4-5e34-4d07-909f-85761440a108","Type":"ContainerDied","Data":"be75631bc8d42c48565f4953c9ccb049089c5d0adb084d80cf93409a73c673f2"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.590936 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" event={"ID":"4fbe65a5-28e7-40db-85f7-66d00806dcbe","Type":"ContainerStarted","Data":"8c6e20a0219d55864ae666002d69e100cd730c8b4a8f482017eab16370609abc"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.593062 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m5k2z" event={"ID":"f80f2233-6a99-49c2-a8fc-1bb335b2dd79","Type":"ContainerStarted","Data":"7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.595267 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" 
event={"ID":"4005cfa7-7eda-43d9-ba7f-fe06d42c82d2","Type":"ContainerStarted","Data":"aa709e092a8b9ac19b6e5360d94aa012d54121b921ef4783c68e8a19d6e7812b"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.597100 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" event={"ID":"c5779a86-4384-4d85-8235-be7dfedc7c68","Type":"ContainerStarted","Data":"26d96fbcf9409851a5ba251ff284d9475cb480a0acfce6573bf59b9e93d4f7d5"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.601956 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" event={"ID":"532c984d-78f6-4e46-be62-53cb87748bcb","Type":"ContainerStarted","Data":"ebb8fcb8b34b4f3bc6df813dfdb9ee4af5d98a548bb0a872430dc7557828ca40"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.617058 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" event={"ID":"de05452b-7cdf-44da-a351-b21ba3691f41","Type":"ContainerStarted","Data":"399618c6b2d5cd3cd740da4e3b1812d30f373ff9e033a1aea89b2170b407e507"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.621115 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" event={"ID":"d8c015fa-00f0-4670-990a-e830b7762674","Type":"ContainerStarted","Data":"d0281343f3ee3a77e4c9ea995bf7f35fe1fe73b457028d7c60af213d1d5536ec"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.636571 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" event={"ID":"c0514d31-211c-4b78-b2a3-8536fe75604d","Type":"ContainerStarted","Data":"2b7e42820235af1800f77a523e9edb5d4d5bf9eaa3d0eaaeb4f50294d19f4dfe"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.650628 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.651432 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.151408226 +0000 UTC m=+56.512482648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.662621 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xxt7h" event={"ID":"e7e78009-45e5-40d0-a208-d0996554a35e","Type":"ContainerStarted","Data":"2fd80cdb0458085d908845ea0fa3ccbc5b1ad740d7aedaf4dcd34a7c7c62a704"} Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.696840 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-npdsj" podStartSLOduration=33.696825002 podStartE2EDuration="33.696825002s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:24.695281832 +0000 UTC m=+56.056356234" watchObservedRunningTime="2025-12-04 
17:27:24.696825002 +0000 UTC m=+56.057899404" Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.736249 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.736203012 podStartE2EDuration="736.203012ms" podCreationTimestamp="2025-12-04 17:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:24.73265481 +0000 UTC m=+56.093729222" watchObservedRunningTime="2025-12-04 17:27:24.736203012 +0000 UTC m=+56.097277414" Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.753018 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.754660 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.254644299 +0000 UTC m=+56.615718701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.861776 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.862521 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.362500321 +0000 UTC m=+56.723574723 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:24 crc kubenswrapper[4948]: I1204 17:27:24.965037 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:24 crc kubenswrapper[4948]: E1204 17:27:24.965498 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.465479157 +0000 UTC m=+56.826553559 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.066416 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.066803 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.56676279 +0000 UTC m=+56.927837192 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.066965 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.067461 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.567449587 +0000 UTC m=+56.928523989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.169226 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.170163 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.670136376 +0000 UTC m=+57.031210778 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.270949 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.271400 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.771386197 +0000 UTC m=+57.132460599 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.373871 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.374583 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.874564929 +0000 UTC m=+57.235639341 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.475924 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.476385 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:25.976363684 +0000 UTC m=+57.337438086 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.577704 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.577941 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.077910953 +0000 UTC m=+57.438985355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.578204 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.578546 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.078536339 +0000 UTC m=+57.439610741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.679587 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.679773 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.17974825 +0000 UTC m=+57.540822652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.679859 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.680145 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.18013788 +0000 UTC m=+57.541212282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.733729 4948 generic.go:334] "Generic (PLEG): container finished" podID="484ee778-914d-4c53-aa0e-6383472e1ebd" containerID="e6d86fc44811715ac83cb9bbda6376b8e97b6a3526576bcf26bb31d30e9001ba" exitCode=0 Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.733793 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" event={"ID":"484ee778-914d-4c53-aa0e-6383472e1ebd","Type":"ContainerDied","Data":"e6d86fc44811715ac83cb9bbda6376b8e97b6a3526576bcf26bb31d30e9001ba"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.769307 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" event={"ID":"e68ac3a8-6fca-4ab1-bd87-ec6cdfe791e1","Type":"ContainerStarted","Data":"d6ed92e9726adb5f3b558ffff9065f0867d72408d20a06f7c395c5da8b4d596b"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.777432 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" event={"ID":"87c3b0a5-59be-438e-a074-b3f5b154039e","Type":"ContainerStarted","Data":"f8519a8672fa1d122b8b686d49f7538dae84f11800854d1ae201daa6f1eb5360"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.782488 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.783671 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.28365602 +0000 UTC m=+57.644730422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.803358 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xgll2" podStartSLOduration=33.803319219 podStartE2EDuration="33.803319219s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:25.795318912 +0000 UTC m=+57.156393314" watchObservedRunningTime="2025-12-04 17:27:25.803319219 +0000 UTC m=+57.164393621" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.813109 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"eb1d8b83073eb53ff10a29fa9d7956e7636f270e0f2b0a2903216b60f93b5d94"} Dec 
04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.830339 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" event={"ID":"62821e25-9412-4650-a9e0-34f4fe49656b","Type":"ContainerStarted","Data":"813548feb85ed86684be112b00d9e592abdc413274bf21d3e2532a759e46104b"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.831137 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.864187 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" event={"ID":"39ae03b6-0da8-43f7-84d2-300f5d0648af","Type":"ContainerStarted","Data":"92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.864425 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.865756 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bvs4t" podStartSLOduration=33.865735325 podStartE2EDuration="33.865735325s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:25.838631673 +0000 UTC m=+57.199706085" watchObservedRunningTime="2025-12-04 17:27:25.865735325 +0000 UTC m=+57.226809727" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.885208 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" 
event={"ID":"c5779a86-4384-4d85-8235-be7dfedc7c68","Type":"ContainerStarted","Data":"7e1a78f874a238d4155cce9818381b80bafc2d522fa04264afc91baeac863c71"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.885402 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.902999 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.402975259 +0000 UTC m=+57.764049661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.932204 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" event={"ID":"b6df6df8-563e-4d8a-b9e5-29250531a399","Type":"ContainerStarted","Data":"5c369250e8e71ba4997284c63c59eb5ecc20a6a141729d6682449afd88ef7734"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.932260 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" 
event={"ID":"b6df6df8-563e-4d8a-b9e5-29250531a399","Type":"ContainerStarted","Data":"d7e2f077b55620327d7bb52ea24b5fc17e7634a9db9e3ecc936a17a56b4965e8"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.932872 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" podStartSLOduration=34.932850452 podStartE2EDuration="34.932850452s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:25.90456755 +0000 UTC m=+57.265641952" watchObservedRunningTime="2025-12-04 17:27:25.932850452 +0000 UTC m=+57.293924854" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.947156 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podStartSLOduration=10.947138332 podStartE2EDuration="10.947138332s" podCreationTimestamp="2025-12-04 17:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:25.924686801 +0000 UTC m=+57.285761203" watchObservedRunningTime="2025-12-04 17:27:25.947138332 +0000 UTC m=+57.308212734" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.958911 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"adeb74b4122c7652d17e63d505ff821dbbe9e3a78569dd50bebd61ca5b101ffe"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.970719 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.983460 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fd0aace14b8253d052836d719bfb392cf97281b45406d39ea90226e196ff7c84"} Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.984083 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mmxqn" podStartSLOduration=33.984031957 podStartE2EDuration="33.984031957s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:25.960310043 +0000 UTC m=+57.321384445" watchObservedRunningTime="2025-12-04 17:27:25.984031957 +0000 UTC m=+57.345106359" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.985068 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:27:25 crc kubenswrapper[4948]: I1204 17:27:25.988532 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:25 crc kubenswrapper[4948]: E1204 17:27:25.990086 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.490039193 +0000 UTC m=+57.851113595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.018348 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" event={"ID":"cad76813-b0e7-4c9c-86e9-44d797f5dbb9","Type":"ContainerStarted","Data":"8fca269ec606ff95a504d12038c04488edeaf640be2bd7c43d146294d142ffea"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.019150 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.022378 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" event={"ID":"9e9c7581-db86-4c8a-9692-3fcf07b99c42","Type":"ContainerStarted","Data":"0771a8a1fe46df15c360abc30ec21355fb034ff14da4ec0491d16390c5d45fd6"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.063211 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" event={"ID":"b106bf37-a37c-45a8-be6a-296e7288eb80","Type":"ContainerStarted","Data":"3718886cb896417eabf0e1f06598b47e8f4bda18baa208d5e89e0e918d6486f5"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.078756 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" 
event={"ID":"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72","Type":"ContainerStarted","Data":"252d6753efddffcdb113b2fd66300c347e9950deb445870a9392fe893823c1a8"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.078807 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" event={"ID":"9dc01e5d-2b5d-4aa8-9c64-7e1b7242ab72","Type":"ContainerStarted","Data":"0351eee3199bea796a66cd7954940368b5d41d354fd0e8d354511fc1cc512e77"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.081570 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.094322 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.096012 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.595993466 +0000 UTC m=+57.957067958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.110345 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" event={"ID":"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2","Type":"ContainerStarted","Data":"48bca2256b497906ce197dda6fcb8cd22cd116b9dbacb84109591a4125e53376"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.112972 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" podStartSLOduration=34.112956435 podStartE2EDuration="34.112956435s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.112562175 +0000 UTC m=+57.473636567" watchObservedRunningTime="2025-12-04 17:27:26.112956435 +0000 UTC m=+57.474030837" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.113165 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" podStartSLOduration=35.113161931 podStartE2EDuration="35.113161931s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.0347203 +0000 UTC m=+57.395794712" watchObservedRunningTime="2025-12-04 17:27:26.113161931 +0000 UTC m=+57.474236333" Dec 04 17:27:26 
crc kubenswrapper[4948]: I1204 17:27:26.132370 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" event={"ID":"96f5544e-1a2b-4d58-9d9c-799509953821","Type":"ContainerStarted","Data":"1e951b22f0743cfad7100cf6cf22e4074e96e668b297d98798d42961dd9468a1"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.134448 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" event={"ID":"4d1c5f04-f0c8-4865-bdba-4347d9840bfb","Type":"ContainerStarted","Data":"6eca8189eff60bcab3c23455a1036361531344fe75ab207484374815509d0a27"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.166422 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vnv9b" podStartSLOduration=34.166404509 podStartE2EDuration="34.166404509s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.164414038 +0000 UTC m=+57.525488450" watchObservedRunningTime="2025-12-04 17:27:26.166404509 +0000 UTC m=+57.527478911" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.167344 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" event={"ID":"c5237623-6755-491c-8345-90f85db04335","Type":"ContainerStarted","Data":"3800e276442c7b0e694c586ff9f19c255167eda8b8e2033672a685597d878aeb"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.199902 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 
04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.200024 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.700004579 +0000 UTC m=+58.061078981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.200726 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.203196 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.703183861 +0000 UTC m=+58.064258263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.203581 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.207879 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" event={"ID":"7c394cd7-6d7c-4880-911d-cc27cc380a17","Type":"ContainerStarted","Data":"38e6925771438fdb3b76e9bcb90fb71c74deb413d0dbc25ba75a3c66a7ee7688"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.207917 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.236864 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tqbdj" event={"ID":"218eaa87-4b22-4db2-8ff0-174995db7128","Type":"ContainerStarted","Data":"8b4d7dbcee59ac5e3aab666e9897598c793b1852f2e5bbd4c6313390c87a9c49"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.237847 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.244672 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-d5zf7" podStartSLOduration=34.244641965 podStartE2EDuration="34.244641965s" 
podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.240953939 +0000 UTC m=+57.602028341" watchObservedRunningTime="2025-12-04 17:27:26.244641965 +0000 UTC m=+57.605716367" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.255549 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.256533 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-zgswc" event={"ID":"720d0657-f05b-415e-a89b-cec265b15235","Type":"ContainerStarted","Data":"7971f168dff3c0ef4dc4bb4a60844589110cf9a39e40c8736c4f8ea07193ef1a"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.282381 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7jtdw" event={"ID":"c3556602-2a66-48fb-a187-85849f5c08e4","Type":"ContainerStarted","Data":"6ac713e31d69bd0bb8125df806f0a351fa1e361e2ec834c124a26f7d1209845f"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.283409 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7jtdw" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.286076 4948 patch_prober.go:28] interesting pod/downloads-7954f5f757-7jtdw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.286116 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7jtdw" podUID="c3556602-2a66-48fb-a187-85849f5c08e4" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.301460 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.302854 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.802827991 +0000 UTC m=+58.163902443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.303559 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" event={"ID":"a4ce65b4-9c33-4639-83a7-49a6c1e4b9ec","Type":"ContainerStarted","Data":"1dae98d9f8387b01aa0791ec4d0e7490c53270e1c0252bd22960674cf697d54f"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.325237 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" podStartSLOduration=35.325212651 podStartE2EDuration="35.325212651s" 
podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.322745467 +0000 UTC m=+57.683819889" watchObservedRunningTime="2025-12-04 17:27:26.325212651 +0000 UTC m=+57.686287053" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.349354 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" event={"ID":"03ef18d3-fb9b-46f0-82a0-4db3172f43a7","Type":"ContainerStarted","Data":"8ed53096b009d775bda39efcb65e85a12984eb198a19fbf581505d5beb537cfc"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.349407 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" event={"ID":"03ef18d3-fb9b-46f0-82a0-4db3172f43a7","Type":"ContainerStarted","Data":"9122cdd2cf2f0e0fc5c8098aea01f2b7019533f4adb295bdbef5fe394a699522"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.350127 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.381509 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" event={"ID":"280ee280-d01c-4e3e-8390-69e6eb19a579","Type":"ContainerStarted","Data":"da25a2367be2b24ec60c67e9f4ef1b66f30b33fd20a9eb35bab52402468dd5de"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.383167 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.390479 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p457n" podStartSLOduration=34.39045687 podStartE2EDuration="34.39045687s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.388817457 +0000 UTC m=+57.749891869" watchObservedRunningTime="2025-12-04 17:27:26.39045687 +0000 UTC m=+57.751531262" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.409095 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.409403 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:26.90939189 +0000 UTC m=+58.270466292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.411992 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" event={"ID":"1fb6542e-ebb3-4df7-95d3-7c6c55fcd845","Type":"ContainerStarted","Data":"a87d56c07a9aa2adaa136482fd0d46c52b294958ac941e53c9b12cef5244f6f8"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.437437 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.451376 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" event={"ID":"4fbe65a5-28e7-40db-85f7-66d00806dcbe","Type":"ContainerStarted","Data":"405ff872249a899b6c39c46af3f01102e37666050fc2c66298eaf34f3d27c970"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.455563 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4qwlk" podStartSLOduration=35.455527254 podStartE2EDuration="35.455527254s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.443034401 +0000 UTC m=+57.804108813" watchObservedRunningTime="2025-12-04 17:27:26.455527254 +0000 UTC m=+57.816601656" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 
17:27:26.476742 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8k5hb" event={"ID":"090a1667-3d12-491d-96d4-3efddf82b503","Type":"ContainerStarted","Data":"3e24798ebebea387040626a181e8536540f6b77556a5a595032b41a9b937071b"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.497553 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" event={"ID":"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf","Type":"ContainerStarted","Data":"9a3935c90d9b28eae221025b6929ffc89ee5b1e51b6801c8da2ecd6cf10b3db1"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.498660 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.511253 4948 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hbqk5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.511646 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.511941 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:26 crc kubenswrapper[4948]: 
E1204 17:27:26.513595 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.013577257 +0000 UTC m=+58.374651659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.515791 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" event={"ID":"702984bc-83a3-4da1-bd02-f8879e78502d","Type":"ContainerStarted","Data":"1796ae814600a948a208a111d68114ac6381992376b853c84c0d4f878894e203"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.516268 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-h58bm"] Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.543985 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-zgswc" podStartSLOduration=34.543970634 podStartE2EDuration="34.543970634s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.542783803 +0000 UTC m=+57.903858205" watchObservedRunningTime="2025-12-04 17:27:26.543970634 +0000 UTC m=+57.905045036" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.545070 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" event={"ID":"97da035d-1f0b-4a53-bc43-04b3a495eda9","Type":"ContainerStarted","Data":"be8b0126f23c50c6d2c2ef4dc838a3137f5d09287599c9412f73e1fd06e94b09"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.545901 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.565401 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xqmtt" podStartSLOduration=34.565381298 podStartE2EDuration="34.565381298s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.563545871 +0000 UTC m=+57.924620273" watchObservedRunningTime="2025-12-04 17:27:26.565381298 +0000 UTC m=+57.926455700" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.577436 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" event={"ID":"b7650dc5-9e1f-49e4-98f8-45836883f728","Type":"ContainerStarted","Data":"2aeb6d17931a93233d7ec5463283a4800d04a796cbb145eac5c1a6b29ef5b115"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.577713 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" event={"ID":"b7650dc5-9e1f-49e4-98f8-45836883f728","Type":"ContainerStarted","Data":"e9b019209c53185c61d833aeeeb92d645f13155c9aba32367f0f90db5d8c8d94"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.609147 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" 
event={"ID":"c9126688-8fd4-46db-8188-dc8014777a8d","Type":"ContainerStarted","Data":"30179a5e350cc9565913131c635acd6864483a9f1d1d5041ff6d95c42597b5c1"} Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.612583 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.612917 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cmmdj" podStartSLOduration=34.612897659 podStartE2EDuration="34.612897659s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.612826047 +0000 UTC m=+57.973900449" watchObservedRunningTime="2025-12-04 17:27:26.612897659 +0000 UTC m=+57.973972061" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.613253 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.614641 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.616163 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.116150193 +0000 UTC m=+58.477224595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.660483 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xfvmf" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.661084 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.743304 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.744419 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.244381783 +0000 UTC m=+58.605456185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.756259 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.782707 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" podStartSLOduration=34.756100826 podStartE2EDuration="34.756100826s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.703990827 +0000 UTC m=+58.065065239" watchObservedRunningTime="2025-12-04 17:27:26.756100826 +0000 UTC m=+58.117175238" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.845805 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.846343 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:27.346331302 +0000 UTC m=+58.707405704 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.926338 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kdgm4" podStartSLOduration=34.926323883 podStartE2EDuration="34.926323883s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.924356742 +0000 UTC m=+58.285431144" watchObservedRunningTime="2025-12-04 17:27:26.926323883 +0000 UTC m=+58.287398285" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.926853 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-kd2hw" podStartSLOduration=34.926849607 podStartE2EDuration="34.926849607s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.835396249 +0000 UTC m=+58.196470641" watchObservedRunningTime="2025-12-04 17:27:26.926849607 +0000 UTC m=+58.287924009" Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.948444 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.948659 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.44860782 +0000 UTC m=+58.809682222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:26 crc kubenswrapper[4948]: I1204 17:27:26.948692 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:26 crc kubenswrapper[4948]: E1204 17:27:26.949031 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.449019101 +0000 UTC m=+58.810093503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.024120 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tqbdj" podStartSLOduration=12.024097775 podStartE2EDuration="12.024097775s" podCreationTimestamp="2025-12-04 17:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:26.980779603 +0000 UTC m=+58.341853995" watchObservedRunningTime="2025-12-04 17:27:27.024097775 +0000 UTC m=+58.385172177" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.025602 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7jtdw" podStartSLOduration=36.025595023 podStartE2EDuration="36.025595023s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.025411259 +0000 UTC m=+58.386485651" watchObservedRunningTime="2025-12-04 17:27:27.025595023 +0000 UTC m=+58.386669425" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.051552 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 
17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.051839 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.551815952 +0000 UTC m=+58.912890354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.082158 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ltp95" podStartSLOduration=35.082138037 podStartE2EDuration="35.082138037s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.078418811 +0000 UTC m=+58.439493213" watchObservedRunningTime="2025-12-04 17:27:27.082138037 +0000 UTC m=+58.443212449" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.090798 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.097910 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:27 crc kubenswrapper[4948]: 
[-]has-synced failed: reason withheld Dec 04 17:27:27 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:27 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.097969 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.157269 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.157847 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.657692483 +0000 UTC m=+59.018766885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.236711 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ggrb4" podStartSLOduration=35.236689588 podStartE2EDuration="35.236689588s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.233897476 +0000 UTC m=+58.594971878" watchObservedRunningTime="2025-12-04 17:27:27.236689588 +0000 UTC m=+58.597763990" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.258438 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.258905 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.758887823 +0000 UTC m=+59.119962225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.262462 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-m5k2z" podStartSLOduration=36.262442215 podStartE2EDuration="36.262442215s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.260802863 +0000 UTC m=+58.621877265" watchObservedRunningTime="2025-12-04 17:27:27.262442215 +0000 UTC m=+58.623516617" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.317644 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-595fv" podStartSLOduration=35.317628054 podStartE2EDuration="35.317628054s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.301740803 +0000 UTC m=+58.662815215" watchObservedRunningTime="2025-12-04 17:27:27.317628054 +0000 UTC m=+58.678702456" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.348297 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" podStartSLOduration=35.348277067 podStartE2EDuration="35.348277067s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.344492699 +0000 UTC m=+58.705567101" watchObservedRunningTime="2025-12-04 17:27:27.348277067 +0000 UTC m=+58.709351469" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.360008 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.360408 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.860393711 +0000 UTC m=+59.221468113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.362735 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77jch"] Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.375799 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.381454 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.437027 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77jch"] Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.453225 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-d6slp" podStartSLOduration=35.453200364 podStartE2EDuration="35.453200364s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.382272968 +0000 UTC m=+58.743347380" watchObservedRunningTime="2025-12-04 17:27:27.453200364 +0000 UTC m=+58.814274766" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.454798 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xfvmf" podStartSLOduration=36.454788735 podStartE2EDuration="36.454788735s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.431386509 +0000 UTC m=+58.792460911" watchObservedRunningTime="2025-12-04 17:27:27.454788735 +0000 UTC m=+58.815863137" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.461350 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.461594 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-catalog-content\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.461708 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksx8q\" (UniqueName: \"kubernetes.io/projected/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-kube-api-access-ksx8q\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.461757 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-utilities\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.461890 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:27.961871288 +0000 UTC m=+59.322945690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.493651 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" podStartSLOduration=35.493629611 podStartE2EDuration="35.493629611s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.492361398 +0000 UTC m=+58.853435810" watchObservedRunningTime="2025-12-04 17:27:27.493629611 +0000 UTC m=+58.854704013" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.511489 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m59pp" podStartSLOduration=35.511468992 podStartE2EDuration="35.511468992s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.511061322 +0000 UTC m=+58.872135734" watchObservedRunningTime="2025-12-04 17:27:27.511468992 +0000 UTC m=+58.872543394" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.548365 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xxt7h" podStartSLOduration=12.548347767 podStartE2EDuration="12.548347767s" podCreationTimestamp="2025-12-04 17:27:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.54768394 +0000 UTC m=+58.908758342" watchObservedRunningTime="2025-12-04 17:27:27.548347767 +0000 UTC m=+58.909422169" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.568701 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-22gwb"] Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.573385 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksx8q\" (UniqueName: \"kubernetes.io/projected/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-kube-api-access-ksx8q\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.573424 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.573445 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-utilities\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.573483 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-catalog-content\") pod \"certified-operators-77jch\" (UID: 
\"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.573878 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-catalog-content\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.574396 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.074380501 +0000 UTC m=+59.435454903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.574590 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-utilities\") pod \"certified-operators-77jch\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.577062 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.580303 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.586179 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22gwb"] Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.635767 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-j8pfs" podStartSLOduration=35.63574912 podStartE2EDuration="35.63574912s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.635413951 +0000 UTC m=+58.996488353" watchObservedRunningTime="2025-12-04 17:27:27.63574912 +0000 UTC m=+58.996823522" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.643068 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" event={"ID":"09680d2b-7d6e-4dcd-bf38-d4642fe27ac2","Type":"ContainerStarted","Data":"43f27c5af4a77acd4d6d3b54a1cf00a5417c548ae08b11443970cea670584a7f"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.646265 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tqbdj" event={"ID":"218eaa87-4b22-4db2-8ff0-174995db7128","Type":"ContainerStarted","Data":"4773cafd723650d989412c1d44390953aa9683f170d079a61cad4e6027463034"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.665030 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksx8q\" (UniqueName: \"kubernetes.io/projected/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-kube-api-access-ksx8q\") pod \"certified-operators-77jch\" (UID: 
\"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.678637 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.678841 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9kq9\" (UniqueName: \"kubernetes.io/projected/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-kube-api-access-x9kq9\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.678893 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-catalog-content\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.678945 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-utilities\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.679034 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.17902037 +0000 UTC m=+59.540094772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.679159 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8k5hb" podStartSLOduration=12.679134853 podStartE2EDuration="12.679134853s" podCreationTimestamp="2025-12-04 17:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.678530288 +0000 UTC m=+59.039604700" watchObservedRunningTime="2025-12-04 17:27:27.679134853 +0000 UTC m=+59.040209255" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.686496 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-wftpx" event={"ID":"c5237623-6755-491c-8345-90f85db04335","Type":"ContainerStarted","Data":"e02b8060b40baf8518d67265feffc422d82caec913d667ff527a01d969b38104"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.702462 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pcnsb" event={"ID":"b6df6df8-563e-4d8a-b9e5-29250531a399","Type":"ContainerStarted","Data":"b38ece347c73b9256923830cf9d97ca7068ea0369b5db1bba33ada534e695c42"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.709066 4948 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hg692" podStartSLOduration=35.709031107 podStartE2EDuration="35.709031107s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.706480221 +0000 UTC m=+59.067554643" watchObservedRunningTime="2025-12-04 17:27:27.709031107 +0000 UTC m=+59.070105509" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.717237 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" event={"ID":"96f5544e-1a2b-4d58-9d9c-799509953821","Type":"ContainerStarted","Data":"47837dabc3f7906a60a4fd3b8e2c9756cd31eb1361f438d5bbc715ff0b9edfda"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.722402 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" event={"ID":"484ee778-914d-4c53-aa0e-6383472e1ebd","Type":"ContainerStarted","Data":"8d904d01aa6df43f1096fededa0ec3848c8038044b733e59254cdb5acd6e14fb"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.722449 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.747347 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.750933 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" event={"ID":"f7dc39c4-5e34-4d07-909f-85761440a108","Type":"ContainerStarted","Data":"a8fe8cbccc790a9065a9397185108cc145aa6900692ba73383fb8abf673d42f3"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.772721 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-trshb" event={"ID":"357c70a4-c799-43ba-8d28-ca99269d41fc","Type":"ContainerStarted","Data":"7316ef2b329e4544683b31647d340c9781895412ed96ad4fdaf453369b7120e3"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.776199 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" event={"ID":"532c984d-78f6-4e46-be62-53cb87748bcb","Type":"ContainerStarted","Data":"62aaac4be6163d1b56164b8661dabb250d0c89cec223ac80427193d94ef8d56f"} Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.780337 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-catalog-content\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.780499 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-utilities\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.780557 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.780584 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9kq9\" (UniqueName: \"kubernetes.io/projected/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-kube-api-access-x9kq9\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.786138 4948 patch_prober.go:28] interesting pod/downloads-7954f5f757-7jtdw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.786198 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7jtdw" podUID="c3556602-2a66-48fb-a187-85849f5c08e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.786725 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5275t" podStartSLOduration=35.786706107 podStartE2EDuration="35.786706107s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.756599928 +0000 UTC m=+59.117674340" watchObservedRunningTime="2025-12-04 17:27:27.786706107 +0000 UTC m=+59.147780519" 
Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.786928 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-utilities\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.787006 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-catalog-content\") pod \"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.787137 4948 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hbqk5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.787164 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.791709 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.291692246 +0000 UTC m=+59.652766648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.822181 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" podStartSLOduration=35.822163735 podStartE2EDuration="35.822163735s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.820088451 +0000 UTC m=+59.181162853" watchObservedRunningTime="2025-12-04 17:27:27.822163735 +0000 UTC m=+59.183238137" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.830378 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qw297"] Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.848798 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.882611 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.883562 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-utilities\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.883750 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-catalog-content\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.883953 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrndn\" (UniqueName: \"kubernetes.io/projected/767a0495-90ff-412b-87da-a788808cda0e-kube-api-access-mrndn\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.883955 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9kq9\" (UniqueName: \"kubernetes.io/projected/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-kube-api-access-x9kq9\") pod 
\"community-operators-22gwb\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.884831 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.384810507 +0000 UTC m=+59.745884909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.885333 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw297"] Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.910074 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.986363 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-utilities\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.986664 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-catalog-content\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.986686 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrndn\" (UniqueName: \"kubernetes.io/projected/767a0495-90ff-412b-87da-a788808cda0e-kube-api-access-mrndn\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.986753 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:27 crc kubenswrapper[4948]: E1204 17:27:27.987030 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:28.487020413 +0000 UTC m=+59.848094815 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.987410 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-utilities\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.987613 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-catalog-content\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.988610 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-f7tp6" podStartSLOduration=35.988572623 podStartE2EDuration="35.988572623s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:27.89883013 +0000 UTC m=+59.259904522" watchObservedRunningTime="2025-12-04 17:27:27.988572623 +0000 UTC m=+59.349647055" Dec 04 17:27:27 crc kubenswrapper[4948]: I1204 17:27:27.989814 4948 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m8dlm"] Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:27.999983 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.019938 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" podStartSLOduration=36.019917815 podStartE2EDuration="36.019917815s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:28.001280682 +0000 UTC m=+59.362355084" watchObservedRunningTime="2025-12-04 17:27:28.019917815 +0000 UTC m=+59.380992217" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.025640 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrndn\" (UniqueName: \"kubernetes.io/projected/767a0495-90ff-412b-87da-a788808cda0e-kube-api-access-mrndn\") pod \"certified-operators-qw297\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.025687 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8dlm"] Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.087536 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.087956 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.587936446 +0000 UTC m=+59.949010848 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.107206 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:28 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:28 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:28 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.107255 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.186925 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fltqd" podStartSLOduration=36.186909868 podStartE2EDuration="36.186909868s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 17:27:28.167188388 +0000 UTC m=+59.528262790" watchObservedRunningTime="2025-12-04 17:27:28.186909868 +0000 UTC m=+59.547984270" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.190124 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-utilities\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.190181 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.190260 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg76f\" (UniqueName: \"kubernetes.io/projected/74848112-8c60-4bcf-9f90-caee5c6e7f17-kube-api-access-gg76f\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.190337 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-catalog-content\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.190712 4948 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.690699246 +0000 UTC m=+60.051773638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.214815 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" podStartSLOduration=37.21480118 podStartE2EDuration="37.21480118s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:28.212436579 +0000 UTC m=+59.573510981" watchObservedRunningTime="2025-12-04 17:27:28.21480118 +0000 UTC m=+59.575875582" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.238611 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.292438 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.292685 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-utilities\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.292746 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg76f\" (UniqueName: \"kubernetes.io/projected/74848112-8c60-4bcf-9f90-caee5c6e7f17-kube-api-access-gg76f\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.292789 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-catalog-content\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.292908 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 17:27:28.792895372 +0000 UTC m=+60.153969774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.293261 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-utilities\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.293579 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-catalog-content\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.335813 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg76f\" (UniqueName: \"kubernetes.io/projected/74848112-8c60-4bcf-9f90-caee5c6e7f17-kube-api-access-gg76f\") pod \"community-operators-m8dlm\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.337200 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.397197 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.397480 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.89746869 +0000 UTC m=+60.258543092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.499358 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.499728 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:28.999707797 +0000 UTC m=+60.360782199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.522428 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77jch"] Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.546858 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.547691 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.551609 4948 patch_prober.go:28] interesting pod/console-f9d7485db-m5k2z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.551638 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-m5k2z" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.605991 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.606686 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:29.106671646 +0000 UTC m=+60.467746058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.706960 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.707377 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:29.207348122 +0000 UTC m=+60.568422524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.782488 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.782530 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.803369 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22gwb"] Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.811928 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:28 crc kubenswrapper[4948]: E1204 17:27:28.812243 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:29.312230778 +0000 UTC m=+60.673305180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.812598 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-trshb" event={"ID":"357c70a4-c799-43ba-8d28-ca99269d41fc","Type":"ContainerStarted","Data":"d0c8a48aacb5653bde9ed55c60ac48ade991858f8a95570f0b6d50df6a34c612"} Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.865215 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77jch" event={"ID":"18aaaacf-fb8c-4ba8-ab03-b89ec705114b","Type":"ContainerStarted","Data":"7679ee7c7516bd4f7e1ecea1a422715d324e4dcc0ed6a7744a8b07374902b3a2"} Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.903607 4948 patch_prober.go:28] interesting pod/downloads-7954f5f757-7jtdw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.903657 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7jtdw" podUID="c3556602-2a66-48fb-a187-85849f5c08e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.907131 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-trshb" 
podStartSLOduration=37.907113183999996 podStartE2EDuration="37.907113184s" podCreationTimestamp="2025-12-04 17:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:28.898367188 +0000 UTC m=+60.259441590" watchObservedRunningTime="2025-12-04 17:27:28.907113184 +0000 UTC m=+60.268187586" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.909971 4948 patch_prober.go:28] interesting pod/downloads-7954f5f757-7jtdw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.910014 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7jtdw" podUID="c3556602-2a66-48fb-a187-85849f5c08e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.959818 4948 patch_prober.go:28] interesting pod/downloads-7954f5f757-7jtdw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 04 17:27:28 crc kubenswrapper[4948]: I1204 17:27:28.959886 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7jtdw" podUID="c3556602-2a66-48fb-a187-85849f5c08e4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.012233 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" 
podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" gracePeriod=30 Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.040891 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.070425 4948 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.081437 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:29.581414517 +0000 UTC m=+60.942488919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.101824 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:29 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:29 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:29 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.101882 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.105581 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.121670 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" event={"ID":"96f5544e-1a2b-4d58-9d9c-799509953821","Type":"ContainerStarted","Data":"0a0f5efc535a88794db887c229ee9f9eb1beb83f9a049306fb851b29f32c1aba"} Dec 04 
17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.121749 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.121828 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.121852 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.129730 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.147721 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.147805 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.178478 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qw297"] Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.180998 
4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8dlm"] Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.193397 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.193900 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 17:27:29.693882359 +0000 UTC m=+61.054956751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g7mvh" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.295903 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:29 crc kubenswrapper[4948]: E1204 17:27:29.297125 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 17:27:29.797109461 +0000 UTC m=+61.158183863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.349693 4948 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-04T17:27:29.070626007Z","Handler":null,"Name":""} Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.358667 4948 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.358704 4948 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.398936 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.416939 4948 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.416981 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.615524 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g7mvh\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.707435 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.716750 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.754244 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jw8ps"] Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.755417 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.760196 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.777183 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jw8ps"] Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.797812 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.806339 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.914878 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-utilities\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.915293 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-catalog-content\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.915465 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6gp\" (UniqueName: \"kubernetes.io/projected/fc2914f1-50b7-4a3a-902e-000091874005-kube-api-access-mt6gp\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.991389 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerStarted","Data":"895288ee0f92b1776d4a032c4e6426f82738e10698ed9ac7b23de6081f6d8f1a"} Dec 04 17:27:29 crc kubenswrapper[4948]: I1204 17:27:29.991427 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerStarted","Data":"8c302319bd6d9b860dd4b8b93b849298c31dbb0f5b394914c579c9ee9e00827c"} Dec 04 17:27:30 crc 
kubenswrapper[4948]: I1204 17:27:30.013873 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerStarted","Data":"f67efc76a2eaf418e39efe02be8a46e657086e9af357b999596d15001218a3f9"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.013918 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerStarted","Data":"cc789c44efb9db216713a106b4a06dfb67783d3675ef658c83de7436f2ced254"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.021694 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6gp\" (UniqueName: \"kubernetes.io/projected/fc2914f1-50b7-4a3a-902e-000091874005-kube-api-access-mt6gp\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.021784 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-utilities\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.021817 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-catalog-content\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.022358 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-catalog-content\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.022921 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-utilities\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.044391 4948 generic.go:334] "Generic (PLEG): container finished" podID="702984bc-83a3-4da1-bd02-f8879e78502d" containerID="1796ae814600a948a208a111d68114ac6381992376b853c84c0d4f878894e203" exitCode=0 Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.044458 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" event={"ID":"702984bc-83a3-4da1-bd02-f8879e78502d","Type":"ContainerDied","Data":"1796ae814600a948a208a111d68114ac6381992376b853c84c0d4f878894e203"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.051000 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6gp\" (UniqueName: \"kubernetes.io/projected/fc2914f1-50b7-4a3a-902e-000091874005-kube-api-access-mt6gp\") pod \"redhat-marketplace-jw8ps\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.053549 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" event={"ID":"96f5544e-1a2b-4d58-9d9c-799509953821","Type":"ContainerStarted","Data":"388cd942d0971b0e263f76e5e88e5c06ba8f483634a2981950a7d0df2eefac01"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.063637 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerStarted","Data":"56ad6fed6cfbb9103ab113b1217f6f37da42e53ec392b2a82155112d1ae3fb3f"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.063674 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerStarted","Data":"6cc9bed0900747e30994511253d755d608e2df2d304b41d3fe122bf34b835678"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.066902 4948 generic.go:334] "Generic (PLEG): container finished" podID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerID="623543b0e9d5a3e0b65bb1aff96a204b52c3882e33b9a9ecd54ce146b78e74cf" exitCode=0 Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.067133 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77jch" event={"ID":"18aaaacf-fb8c-4ba8-ab03-b89ec705114b","Type":"ContainerDied","Data":"623543b0e9d5a3e0b65bb1aff96a204b52c3882e33b9a9ecd54ce146b78e74cf"} Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.068482 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.076891 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-kk9wl" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.078613 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.086790 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tsgwn" podStartSLOduration=15.086747675 podStartE2EDuration="15.086747675s" podCreationTimestamp="2025-12-04 17:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:30.076436958 +0000 UTC m=+61.437511360" watchObservedRunningTime="2025-12-04 17:27:30.086747675 +0000 UTC m=+61.447822087" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.091813 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:30 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:30 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:30 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.091879 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.141285 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g7mvh"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.148106 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mzq82"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.149373 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.165258 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzq82"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.339835 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-utilities\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.340367 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-catalog-content\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.340395 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vcm\" (UniqueName: \"kubernetes.io/projected/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-kube-api-access-x7vcm\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.364530 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jw8ps"] Dec 04 17:27:30 crc kubenswrapper[4948]: W1204 17:27:30.382959 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc2914f1_50b7_4a3a_902e_000091874005.slice/crio-f52734e7864f36322f5a2d8d2e4a1fafcbbd32542e1566cee03af066a0cae5f7 WatchSource:0}: Error 
finding container f52734e7864f36322f5a2d8d2e4a1fafcbbd32542e1566cee03af066a0cae5f7: Status 404 returned error can't find the container with id f52734e7864f36322f5a2d8d2e4a1fafcbbd32542e1566cee03af066a0cae5f7 Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.441182 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-utilities\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.441580 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-catalog-content\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.441601 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vcm\" (UniqueName: \"kubernetes.io/projected/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-kube-api-access-x7vcm\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.441817 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-utilities\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.442115 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-catalog-content\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.467910 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vcm\" (UniqueName: \"kubernetes.io/projected/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-kube-api-access-x7vcm\") pod \"redhat-marketplace-mzq82\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.470781 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.540310 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.541031 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.542111 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.542146 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.544518 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.544729 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.559504 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l48pp"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.560731 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.566019 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.581798 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l48pp"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.600794 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.644812 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-utilities\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.644892 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt2qp\" (UniqueName: \"kubernetes.io/projected/cc8a9450-7e86-4194-962d-566fee4563df-kube-api-access-bt2qp\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.644977 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.645020 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.645103 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-catalog-content\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.645194 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.662593 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.717772 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzq82"] Dec 04 17:27:30 crc kubenswrapper[4948]: W1204 17:27:30.730766 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad77bb80_fd4a_4f6d_ac4d_d7a3e7c61aea.slice/crio-aafb7240c8b80b274814b0a4bb515c607c6ced0a249519c05074a5d1cefe9986 WatchSource:0}: Error finding container 
aafb7240c8b80b274814b0a4bb515c607c6ced0a249519c05074a5d1cefe9986: Status 404 returned error can't find the container with id aafb7240c8b80b274814b0a4bb515c607c6ced0a249519c05074a5d1cefe9986 Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.745869 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-catalog-content\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.745967 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-utilities\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.746002 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt2qp\" (UniqueName: \"kubernetes.io/projected/cc8a9450-7e86-4194-962d-566fee4563df-kube-api-access-bt2qp\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.746497 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-utilities\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.746519 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-catalog-content\") pod 
\"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.757387 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gfq54"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.758403 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.766885 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt2qp\" (UniqueName: \"kubernetes.io/projected/cc8a9450-7e86-4194-962d-566fee4563df-kube-api-access-bt2qp\") pod \"redhat-operators-l48pp\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.768750 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gfq54"] Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.847381 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-catalog-content\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.847644 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-utilities\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.847678 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqrj\" (UniqueName: \"kubernetes.io/projected/e260d86e-160c-4d14-896c-bcc2b35d2f90-kube-api-access-pzqrj\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.869528 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.898511 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.948186 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-utilities\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.948252 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqrj\" (UniqueName: \"kubernetes.io/projected/e260d86e-160c-4d14-896c-bcc2b35d2f90-kube-api-access-pzqrj\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.948335 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-catalog-content\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.948775 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-utilities\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.948987 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-catalog-content\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:30 crc kubenswrapper[4948]: I1204 17:27:30.964006 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqrj\" (UniqueName: \"kubernetes.io/projected/e260d86e-160c-4d14-896c-bcc2b35d2f90-kube-api-access-pzqrj\") pod \"redhat-operators-gfq54\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.073753 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.074643 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.080694 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.084880 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzq82" event={"ID":"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea","Type":"ContainerStarted","Data":"aafb7240c8b80b274814b0a4bb515c607c6ced0a249519c05074a5d1cefe9986"} Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.086214 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hv9v6" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.089165 4948 generic.go:334] "Generic (PLEG): container finished" podID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerID="56ad6fed6cfbb9103ab113b1217f6f37da42e53ec392b2a82155112d1ae3fb3f" exitCode=0 Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.089273 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerDied","Data":"56ad6fed6cfbb9103ab113b1217f6f37da42e53ec392b2a82155112d1ae3fb3f"} Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.093256 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:31 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:31 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:31 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.093320 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.166636 4948 generic.go:334] "Generic (PLEG): container finished" podID="767a0495-90ff-412b-87da-a788808cda0e" containerID="895288ee0f92b1776d4a032c4e6426f82738e10698ed9ac7b23de6081f6d8f1a" exitCode=0 Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.167209 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerDied","Data":"895288ee0f92b1776d4a032c4e6426f82738e10698ed9ac7b23de6081f6d8f1a"} Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.177667 4948 generic.go:334] "Generic (PLEG): container finished" podID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerID="f67efc76a2eaf418e39efe02be8a46e657086e9af357b999596d15001218a3f9" exitCode=0 Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.177761 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerDied","Data":"f67efc76a2eaf418e39efe02be8a46e657086e9af357b999596d15001218a3f9"} Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.181158 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jw8ps" event={"ID":"fc2914f1-50b7-4a3a-902e-000091874005","Type":"ContainerStarted","Data":"f52734e7864f36322f5a2d8d2e4a1fafcbbd32542e1566cee03af066a0cae5f7"} Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.186932 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" event={"ID":"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863","Type":"ContainerStarted","Data":"df842ebc971179dc5117715d5ee59142ae2db975ce699ff54d688a730381f8a7"} Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.359338 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-l48pp"] Dec 04 17:27:31 crc kubenswrapper[4948]: W1204 17:27:31.384328 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc8a9450_7e86_4194_962d_566fee4563df.slice/crio-9acbcc6ea022375a18ee40b1101cd0cbcca3eb2bf6cea64a6989e7dc08574ed5 WatchSource:0}: Error finding container 9acbcc6ea022375a18ee40b1101cd0cbcca3eb2bf6cea64a6989e7dc08574ed5: Status 404 returned error can't find the container with id 9acbcc6ea022375a18ee40b1101cd0cbcca3eb2bf6cea64a6989e7dc08574ed5 Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.483824 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gfq54"] Dec 04 17:27:31 crc kubenswrapper[4948]: W1204 17:27:31.505637 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode260d86e_160c_4d14_896c_bcc2b35d2f90.slice/crio-6dd1dacef56ba1fd2710c82b3fd90b22fe61e4774e46c3899cc5840744082708 WatchSource:0}: Error finding container 6dd1dacef56ba1fd2710c82b3fd90b22fe61e4774e46c3899cc5840744082708: Status 404 returned error can't find the container with id 6dd1dacef56ba1fd2710c82b3fd90b22fe61e4774e46c3899cc5840744082708 Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.626795 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.674953 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.676428 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.691658 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f47382b4-4f12-471b-92aa-5d4ccb9c0bf0-metrics-certs\") pod \"network-metrics-daemon-t6lr5\" (UID: \"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0\") " pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.776187 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702984bc-83a3-4da1-bd02-f8879e78502d-secret-volume\") pod \"702984bc-83a3-4da1-bd02-f8879e78502d\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.776365 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702984bc-83a3-4da1-bd02-f8879e78502d-config-volume\") pod \"702984bc-83a3-4da1-bd02-f8879e78502d\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.776394 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtjfg\" (UniqueName: 
\"kubernetes.io/projected/702984bc-83a3-4da1-bd02-f8879e78502d-kube-api-access-dtjfg\") pod \"702984bc-83a3-4da1-bd02-f8879e78502d\" (UID: \"702984bc-83a3-4da1-bd02-f8879e78502d\") " Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.777588 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/702984bc-83a3-4da1-bd02-f8879e78502d-config-volume" (OuterVolumeSpecName: "config-volume") pod "702984bc-83a3-4da1-bd02-f8879e78502d" (UID: "702984bc-83a3-4da1-bd02-f8879e78502d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.779870 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702984bc-83a3-4da1-bd02-f8879e78502d-kube-api-access-dtjfg" (OuterVolumeSpecName: "kube-api-access-dtjfg") pod "702984bc-83a3-4da1-bd02-f8879e78502d" (UID: "702984bc-83a3-4da1-bd02-f8879e78502d"). InnerVolumeSpecName "kube-api-access-dtjfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.780018 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702984bc-83a3-4da1-bd02-f8879e78502d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "702984bc-83a3-4da1-bd02-f8879e78502d" (UID: "702984bc-83a3-4da1-bd02-f8879e78502d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.877551 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/702984bc-83a3-4da1-bd02-f8879e78502d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.877591 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/702984bc-83a3-4da1-bd02-f8879e78502d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.877605 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjfg\" (UniqueName: \"kubernetes.io/projected/702984bc-83a3-4da1-bd02-f8879e78502d-kube-api-access-dtjfg\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.927707 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 17:27:31 crc kubenswrapper[4948]: I1204 17:27:31.935524 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t6lr5" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.091945 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:32 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:32 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:32 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.092263 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.201468 4948 generic.go:334] "Generic (PLEG): container finished" podID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerID="eb23f9cac75a9ba791028371b7edb1e872659deb2ff29eff5b35ec1df2a63456" exitCode=0 Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.201741 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzq82" event={"ID":"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea","Type":"ContainerDied","Data":"eb23f9cac75a9ba791028371b7edb1e872659deb2ff29eff5b35ec1df2a63456"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.211292 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" event={"ID":"702984bc-83a3-4da1-bd02-f8879e78502d","Type":"ContainerDied","Data":"a81bba86c2fb25a63be051d86d5be788ba0c4d9461ec541645c0a2d6da266c18"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.211305 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.211323 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81bba86c2fb25a63be051d86d5be788ba0c4d9461ec541645c0a2d6da266c18" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.221958 4948 generic.go:334] "Generic (PLEG): container finished" podID="4926a96f-d476-4163-9dfe-3eef0b0c53e4" containerID="8b7796e7a9bd3b896fedc75a3e1a8b84b291e3ae5a9e8937c96754d21cb33d59" exitCode=0 Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.222003 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4926a96f-d476-4163-9dfe-3eef0b0c53e4","Type":"ContainerDied","Data":"8b7796e7a9bd3b896fedc75a3e1a8b84b291e3ae5a9e8937c96754d21cb33d59"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.222082 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4926a96f-d476-4163-9dfe-3eef0b0c53e4","Type":"ContainerStarted","Data":"bf013dd3676a275082270e8b7d5fa58b6636f57e0d15c92f0617459da38ae25e"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.224142 4948 generic.go:334] "Generic (PLEG): container finished" podID="cc8a9450-7e86-4194-962d-566fee4563df" containerID="f3c7c51bc6b9731eccdf013115cdb196c6ce1d0b2a68096f6728213c63a84504" exitCode=0 Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.224322 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerDied","Data":"f3c7c51bc6b9731eccdf013115cdb196c6ce1d0b2a68096f6728213c63a84504"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.224397 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" 
event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerStarted","Data":"9acbcc6ea022375a18ee40b1101cd0cbcca3eb2bf6cea64a6989e7dc08574ed5"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.232913 4948 generic.go:334] "Generic (PLEG): container finished" podID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerID="71e2e8a4b900e28eb00c0c94979d51e66fba196935d43792f6fbaa1e83ff13bc" exitCode=0 Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.232986 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerDied","Data":"71e2e8a4b900e28eb00c0c94979d51e66fba196935d43792f6fbaa1e83ff13bc"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.233020 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerStarted","Data":"6dd1dacef56ba1fd2710c82b3fd90b22fe61e4774e46c3899cc5840744082708"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.241411 4948 generic.go:334] "Generic (PLEG): container finished" podID="fc2914f1-50b7-4a3a-902e-000091874005" containerID="318fab065d04f84f9e82f28d4f88ab0e4e33b2baa91493b6477288e5566bcafa" exitCode=0 Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.241580 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jw8ps" event={"ID":"fc2914f1-50b7-4a3a-902e-000091874005","Type":"ContainerDied","Data":"318fab065d04f84f9e82f28d4f88ab0e4e33b2baa91493b6477288e5566bcafa"} Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.265897 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" event={"ID":"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863","Type":"ContainerStarted","Data":"c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e"} Dec 04 17:27:32 crc kubenswrapper[4948]: 
I1204 17:27:32.266095 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.299857 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" podStartSLOduration=40.29983453 podStartE2EDuration="40.29983453s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:32.29791921 +0000 UTC m=+63.658993642" watchObservedRunningTime="2025-12-04 17:27:32.29983453 +0000 UTC m=+63.660908932" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.443717 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t6lr5"] Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.617810 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.618088 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:32 crc kubenswrapper[4948]: I1204 17:27:32.623843 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.093087 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:33 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:33 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:33 crc kubenswrapper[4948]: healthz check failed Dec 04 
17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.093392 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.274626 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t6lr5" event={"ID":"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0","Type":"ContainerStarted","Data":"5d089f26c26fda2a1a2383fbba67378181a791b121153c8bc750e277138987b7"} Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.274674 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t6lr5" event={"ID":"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0","Type":"ContainerStarted","Data":"ec42b6371e80ca40c3918f7429b58d605dd5efbc96746ab00b49389a17e8b072"} Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.279494 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-trshb" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.693699 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.810439 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kubelet-dir\") pod \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.810541 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kube-api-access\") pod \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\" (UID: \"4926a96f-d476-4163-9dfe-3eef0b0c53e4\") " Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.822986 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4926a96f-d476-4163-9dfe-3eef0b0c53e4" (UID: "4926a96f-d476-4163-9dfe-3eef0b0c53e4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.827219 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4926a96f-d476-4163-9dfe-3eef0b0c53e4" (UID: "4926a96f-d476-4163-9dfe-3eef0b0c53e4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.911829 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:33 crc kubenswrapper[4948]: I1204 17:27:33.911862 4948 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4926a96f-d476-4163-9dfe-3eef0b0c53e4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.091150 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:34 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:34 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:34 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.091393 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.291179 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4926a96f-d476-4163-9dfe-3eef0b0c53e4","Type":"ContainerDied","Data":"bf013dd3676a275082270e8b7d5fa58b6636f57e0d15c92f0617459da38ae25e"} Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.291218 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf013dd3676a275082270e8b7d5fa58b6636f57e0d15c92f0617459da38ae25e" Dec 04 17:27:34 crc 
kubenswrapper[4948]: I1204 17:27:34.291458 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.300481 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t6lr5" event={"ID":"f47382b4-4f12-471b-92aa-5d4ccb9c0bf0","Type":"ContainerStarted","Data":"008016beabc42e9c88f716ee6a676c840091f774eeeed7eeafda2d70d96cba17"} Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.328968 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-t6lr5" podStartSLOduration=42.328945463 podStartE2EDuration="42.328945463s" podCreationTimestamp="2025-12-04 17:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:34.324488677 +0000 UTC m=+65.685563079" watchObservedRunningTime="2025-12-04 17:27:34.328945463 +0000 UTC m=+65.690019865" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.340347 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tqbdj" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.750478 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 17:27:34 crc kubenswrapper[4948]: E1204 17:27:34.750851 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4926a96f-d476-4163-9dfe-3eef0b0c53e4" containerName="pruner" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.750866 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4926a96f-d476-4163-9dfe-3eef0b0c53e4" containerName="pruner" Dec 04 17:27:34 crc kubenswrapper[4948]: E1204 17:27:34.750888 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702984bc-83a3-4da1-bd02-f8879e78502d" 
containerName="collect-profiles" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.750895 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="702984bc-83a3-4da1-bd02-f8879e78502d" containerName="collect-profiles" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.751032 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4926a96f-d476-4163-9dfe-3eef0b0c53e4" containerName="pruner" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.751066 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="702984bc-83a3-4da1-bd02-f8879e78502d" containerName="collect-profiles" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.751634 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.753298 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.753806 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.754538 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.822575 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87fb4014-ea5d-4ea3-901d-9e68912d213b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.822632 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/87fb4014-ea5d-4ea3-901d-9e68912d213b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.925305 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87fb4014-ea5d-4ea3-901d-9e68912d213b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.925405 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87fb4014-ea5d-4ea3-901d-9e68912d213b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.925438 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87fb4014-ea5d-4ea3-901d-9e68912d213b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:34 crc kubenswrapper[4948]: I1204 17:27:34.952246 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87fb4014-ea5d-4ea3-901d-9e68912d213b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:35 crc kubenswrapper[4948]: I1204 17:27:35.068591 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:35 crc kubenswrapper[4948]: I1204 17:27:35.094769 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:35 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:35 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:35 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:35 crc kubenswrapper[4948]: I1204 17:27:35.094834 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:35 crc kubenswrapper[4948]: I1204 17:27:35.465708 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 17:27:36 crc kubenswrapper[4948]: I1204 17:27:36.091911 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:36 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:36 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:36 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:36 crc kubenswrapper[4948]: I1204 17:27:36.092271 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:36 crc kubenswrapper[4948]: I1204 17:27:36.414580 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87fb4014-ea5d-4ea3-901d-9e68912d213b","Type":"ContainerStarted","Data":"275bcc61acbee33f16144874358b1b3815261bc96f5ab37f28953daa9cddd9f6"} Dec 04 17:27:37 crc kubenswrapper[4948]: I1204 17:27:37.101644 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:37 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:37 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:37 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:37 crc kubenswrapper[4948]: I1204 17:27:37.101711 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:38 crc kubenswrapper[4948]: I1204 17:27:38.092929 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:38 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:38 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:38 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:38 crc kubenswrapper[4948]: I1204 17:27:38.093202 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:38 crc kubenswrapper[4948]: I1204 17:27:38.442702 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87fb4014-ea5d-4ea3-901d-9e68912d213b","Type":"ContainerStarted","Data":"c105eb235deb8b8c2c292e10ea1b37730eceac6a018121ab6a198addbf551ca0"} Dec 04 17:27:38 crc kubenswrapper[4948]: I1204 17:27:38.547985 4948 patch_prober.go:28] interesting pod/console-f9d7485db-m5k2z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 04 17:27:38 crc kubenswrapper[4948]: I1204 17:27:38.548080 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-m5k2z" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 04 17:27:38 crc kubenswrapper[4948]: I1204 17:27:38.903867 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7jtdw" Dec 04 17:27:39 crc kubenswrapper[4948]: E1204 17:27:39.044104 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:39 crc kubenswrapper[4948]: E1204 17:27:39.046179 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:39 crc kubenswrapper[4948]: E1204 17:27:39.049764 4948 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:39 crc kubenswrapper[4948]: E1204 17:27:39.049937 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:27:39 crc kubenswrapper[4948]: I1204 17:27:39.091173 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:39 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:39 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:39 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:39 crc kubenswrapper[4948]: I1204 17:27:39.091241 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:39 crc kubenswrapper[4948]: I1204 17:27:39.476128 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.476063035 podStartE2EDuration="5.476063035s" podCreationTimestamp="2025-12-04 17:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:39.468887317 +0000 UTC m=+70.829961719" 
watchObservedRunningTime="2025-12-04 17:27:39.476063035 +0000 UTC m=+70.837137447" Dec 04 17:27:40 crc kubenswrapper[4948]: I1204 17:27:40.092145 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:40 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:40 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:40 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:40 crc kubenswrapper[4948]: I1204 17:27:40.092436 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:40 crc kubenswrapper[4948]: I1204 17:27:40.465779 4948 generic.go:334] "Generic (PLEG): container finished" podID="87fb4014-ea5d-4ea3-901d-9e68912d213b" containerID="c105eb235deb8b8c2c292e10ea1b37730eceac6a018121ab6a198addbf551ca0" exitCode=0 Dec 04 17:27:40 crc kubenswrapper[4948]: I1204 17:27:40.465823 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87fb4014-ea5d-4ea3-901d-9e68912d213b","Type":"ContainerDied","Data":"c105eb235deb8b8c2c292e10ea1b37730eceac6a018121ab6a198addbf551ca0"} Dec 04 17:27:41 crc kubenswrapper[4948]: I1204 17:27:41.091291 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:41 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:41 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:41 crc kubenswrapper[4948]: healthz check 
failed Dec 04 17:27:41 crc kubenswrapper[4948]: I1204 17:27:41.091353 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:42 crc kubenswrapper[4948]: I1204 17:27:42.091447 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:42 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:42 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:42 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:42 crc kubenswrapper[4948]: I1204 17:27:42.091812 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:43 crc kubenswrapper[4948]: I1204 17:27:43.091904 4948 patch_prober.go:28] interesting pod/router-default-5444994796-zgswc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 17:27:43 crc kubenswrapper[4948]: [-]has-synced failed: reason withheld Dec 04 17:27:43 crc kubenswrapper[4948]: [+]process-running ok Dec 04 17:27:43 crc kubenswrapper[4948]: healthz check failed Dec 04 17:27:43 crc kubenswrapper[4948]: I1204 17:27:43.091967 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-zgswc" podUID="720d0657-f05b-415e-a89b-cec265b15235" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 17:27:44 crc 
kubenswrapper[4948]: I1204 17:27:44.091356 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:44 crc kubenswrapper[4948]: I1204 17:27:44.093763 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-zgswc" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.555101 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.561507 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.623055 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.662613 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87fb4014-ea5d-4ea3-901d-9e68912d213b-kube-api-access\") pod \"87fb4014-ea5d-4ea3-901d-9e68912d213b\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.662794 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87fb4014-ea5d-4ea3-901d-9e68912d213b-kubelet-dir\") pod \"87fb4014-ea5d-4ea3-901d-9e68912d213b\" (UID: \"87fb4014-ea5d-4ea3-901d-9e68912d213b\") " Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.662989 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87fb4014-ea5d-4ea3-901d-9e68912d213b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "87fb4014-ea5d-4ea3-901d-9e68912d213b" (UID: "87fb4014-ea5d-4ea3-901d-9e68912d213b"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.663272 4948 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87fb4014-ea5d-4ea3-901d-9e68912d213b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.669620 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fb4014-ea5d-4ea3-901d-9e68912d213b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "87fb4014-ea5d-4ea3-901d-9e68912d213b" (UID: "87fb4014-ea5d-4ea3-901d-9e68912d213b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:27:48 crc kubenswrapper[4948]: I1204 17:27:48.764648 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87fb4014-ea5d-4ea3-901d-9e68912d213b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 17:27:49 crc kubenswrapper[4948]: E1204 17:27:49.043989 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:49 crc kubenswrapper[4948]: E1204 17:27:49.045927 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:49 crc kubenswrapper[4948]: E1204 17:27:49.047533 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:49 crc kubenswrapper[4948]: E1204 17:27:49.047589 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:27:49 crc kubenswrapper[4948]: I1204 17:27:49.519275 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 17:27:49 crc kubenswrapper[4948]: I1204 17:27:49.519274 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"87fb4014-ea5d-4ea3-901d-9e68912d213b","Type":"ContainerDied","Data":"275bcc61acbee33f16144874358b1b3815261bc96f5ab37f28953daa9cddd9f6"} Dec 04 17:27:49 crc kubenswrapper[4948]: I1204 17:27:49.519729 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="275bcc61acbee33f16144874358b1b3815261bc96f5ab37f28953daa9cddd9f6" Dec 04 17:27:49 crc kubenswrapper[4948]: I1204 17:27:49.814199 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:27:52 crc kubenswrapper[4948]: I1204 17:27:52.930036 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 04 17:27:58 crc kubenswrapper[4948]: I1204 17:27:58.939334 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.939311228 
podStartE2EDuration="6.939311228s" podCreationTimestamp="2025-12-04 17:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:27:58.936444087 +0000 UTC m=+90.297518529" watchObservedRunningTime="2025-12-04 17:27:58.939311228 +0000 UTC m=+90.300385670" Dec 04 17:27:59 crc kubenswrapper[4948]: E1204 17:27:59.041251 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:59 crc kubenswrapper[4948]: E1204 17:27:59.041700 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:59 crc kubenswrapper[4948]: E1204 17:27:59.042191 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:27:59 crc kubenswrapper[4948]: E1204 17:27:59.042247 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: 
container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:27:59 crc kubenswrapper[4948]: I1204 17:27:59.114079 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cjplw" Dec 04 17:28:00 crc kubenswrapper[4948]: I1204 17:28:00.589197 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-h58bm_39ae03b6-0da8-43f7-84d2-300f5d0648af/kube-multus-additional-cni-plugins/0.log" Dec 04 17:28:00 crc kubenswrapper[4948]: I1204 17:28:00.589506 4948 generic.go:334] "Generic (PLEG): container finished" podID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" exitCode=137 Dec 04 17:28:00 crc kubenswrapper[4948]: I1204 17:28:00.589545 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" event={"ID":"39ae03b6-0da8-43f7-84d2-300f5d0648af","Type":"ContainerDied","Data":"92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b"} Dec 04 17:28:02 crc kubenswrapper[4948]: I1204 17:28:02.803512 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 17:28:09 crc kubenswrapper[4948]: E1204 17:28:09.041237 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:09 crc kubenswrapper[4948]: E1204 17:28:09.042248 4948 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:09 crc kubenswrapper[4948]: E1204 17:28:09.042679 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:09 crc kubenswrapper[4948]: E1204 17:28:09.042753 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.736541 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 17:28:09 crc kubenswrapper[4948]: E1204 17:28:09.736755 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87fb4014-ea5d-4ea3-901d-9e68912d213b" containerName="pruner" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.736767 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fb4014-ea5d-4ea3-901d-9e68912d213b" containerName="pruner" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.736872 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="87fb4014-ea5d-4ea3-901d-9e68912d213b" containerName="pruner" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.742219 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.747848 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.747948 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.751028 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.856814 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4776390c-6250-4b5c-ac19-f644150131ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.856903 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4776390c-6250-4b5c-ac19-f644150131ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.957613 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4776390c-6250-4b5c-ac19-f644150131ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.957706 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4776390c-6250-4b5c-ac19-f644150131ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.957805 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4776390c-6250-4b5c-ac19-f644150131ac-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:09 crc kubenswrapper[4948]: I1204 17:28:09.980765 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4776390c-6250-4b5c-ac19-f644150131ac-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:10 crc kubenswrapper[4948]: I1204 17:28:10.061275 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.535755 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.538329 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.545356 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.725186 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.725280 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-var-lock\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.725323 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78ea971c-3d85-48c2-8eed-04157dfa2f78-kube-api-access\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.826962 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78ea971c-3d85-48c2-8eed-04157dfa2f78-kube-api-access\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.827175 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.827308 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-var-lock\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.827463 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-kubelet-dir\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.827503 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-var-lock\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.856999 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78ea971c-3d85-48c2-8eed-04157dfa2f78-kube-api-access\") pod \"installer-9-crc\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.872220 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:28:14 crc kubenswrapper[4948]: I1204 17:28:14.940409 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 04 17:28:18 crc kubenswrapper[4948]: I1204 17:28:18.961851 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.961825416 podStartE2EDuration="4.961825416s" podCreationTimestamp="2025-12-04 17:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:28:18.956573719 +0000 UTC m=+110.317648131" watchObservedRunningTime="2025-12-04 17:28:18.961825416 +0000 UTC m=+110.322899818" Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.041473 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.042003 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.042542 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process 
not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.042578 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.891116 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.891616 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzqrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gfq54_openshift-marketplace(e260d86e-160c-4d14-896c-bcc2b35d2f90): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:19 crc kubenswrapper[4948]: E1204 17:28:19.893004 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gfq54" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" Dec 04 17:28:23 crc 
kubenswrapper[4948]: E1204 17:28:23.566185 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gfq54" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" Dec 04 17:28:26 crc kubenswrapper[4948]: E1204 17:28:26.835938 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 17:28:26 crc kubenswrapper[4948]: E1204 17:28:26.836457 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mt6gp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jw8ps_openshift-marketplace(fc2914f1-50b7-4a3a-902e-000091874005): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:26 crc kubenswrapper[4948]: E1204 17:28:26.837736 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jw8ps" podUID="fc2914f1-50b7-4a3a-902e-000091874005" Dec 04 17:28:29 crc 
kubenswrapper[4948]: E1204 17:28:29.041313 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:29 crc kubenswrapper[4948]: E1204 17:28:29.041901 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:29 crc kubenswrapper[4948]: E1204 17:28:29.042353 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 04 17:28:29 crc kubenswrapper[4948]: E1204 17:28:29.042416 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:28:32 crc kubenswrapper[4948]: E1204 17:28:32.074810 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jw8ps" podUID="fc2914f1-50b7-4a3a-902e-000091874005" Dec 04 17:28:32 crc kubenswrapper[4948]: E1204 17:28:32.753112 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 17:28:32 crc kubenswrapper[4948]: E1204 17:28:32.753414 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x9kq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOn
ce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-22gwb_openshift-marketplace(09f28c0e-7133-4236-9614-fe2fe6b5e2e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:32 crc kubenswrapper[4948]: E1204 17:28:32.754842 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-22gwb" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.038854 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-22gwb" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.101911 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-h58bm_39ae03b6-0da8-43f7-84d2-300f5d0648af/kube-multus-additional-cni-plugins/0.log" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.101978 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.137224 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39ae03b6-0da8-43f7-84d2-300f5d0648af-tuning-conf-dir\") pod \"39ae03b6-0da8-43f7-84d2-300f5d0648af\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.137470 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39ae03b6-0da8-43f7-84d2-300f5d0648af-cni-sysctl-allowlist\") pod \"39ae03b6-0da8-43f7-84d2-300f5d0648af\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.137506 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/39ae03b6-0da8-43f7-84d2-300f5d0648af-ready\") pod \"39ae03b6-0da8-43f7-84d2-300f5d0648af\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.137311 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39ae03b6-0da8-43f7-84d2-300f5d0648af-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "39ae03b6-0da8-43f7-84d2-300f5d0648af" (UID: "39ae03b6-0da8-43f7-84d2-300f5d0648af"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.137798 4948 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/39ae03b6-0da8-43f7-84d2-300f5d0648af-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.138291 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ae03b6-0da8-43f7-84d2-300f5d0648af-ready" (OuterVolumeSpecName: "ready") pod "39ae03b6-0da8-43f7-84d2-300f5d0648af" (UID: "39ae03b6-0da8-43f7-84d2-300f5d0648af"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.138323 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39ae03b6-0da8-43f7-84d2-300f5d0648af-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "39ae03b6-0da8-43f7-84d2-300f5d0648af" (UID: "39ae03b6-0da8-43f7-84d2-300f5d0648af"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.143243 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.143382 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7vcm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod redhat-marketplace-mzq82_openshift-marketplace(ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.144735 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mzq82" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.160571 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.160778 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ksx8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-77jch_openshift-marketplace(18aaaacf-fb8c-4ba8-ab03-b89ec705114b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.162150 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-77jch" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" Dec 04 17:28:34 crc 
kubenswrapper[4948]: E1204 17:28:34.180898 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.181025 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrndn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-qw297_openshift-marketplace(767a0495-90ff-412b-87da-a788808cda0e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.183841 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qw297" podUID="767a0495-90ff-412b-87da-a788808cda0e" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.229443 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.229605 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gg76f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-m8dlm_openshift-marketplace(74848112-8c60-4bcf-9f90-caee5c6e7f17): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.230912 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-m8dlm" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" Dec 04 17:28:34 crc 
kubenswrapper[4948]: I1204 17:28:34.238371 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj4jp\" (UniqueName: \"kubernetes.io/projected/39ae03b6-0da8-43f7-84d2-300f5d0648af-kube-api-access-zj4jp\") pod \"39ae03b6-0da8-43f7-84d2-300f5d0648af\" (UID: \"39ae03b6-0da8-43f7-84d2-300f5d0648af\") " Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.238575 4948 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/39ae03b6-0da8-43f7-84d2-300f5d0648af-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.238588 4948 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/39ae03b6-0da8-43f7-84d2-300f5d0648af-ready\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.244138 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ae03b6-0da8-43f7-84d2-300f5d0648af-kube-api-access-zj4jp" (OuterVolumeSpecName: "kube-api-access-zj4jp") pod "39ae03b6-0da8-43f7-84d2-300f5d0648af" (UID: "39ae03b6-0da8-43f7-84d2-300f5d0648af"). InnerVolumeSpecName "kube-api-access-zj4jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.256258 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.256433 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bt2qp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Restar
tPolicy:nil,} start failed in pod redhat-operators-l48pp_openshift-marketplace(cc8a9450-7e86-4194-962d-566fee4563df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.257580 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-l48pp" podUID="cc8a9450-7e86-4194-962d-566fee4563df" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.340602 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj4jp\" (UniqueName: \"kubernetes.io/projected/39ae03b6-0da8-43f7-84d2-300f5d0648af-kube-api-access-zj4jp\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.502485 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.518671 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 17:28:34 crc kubenswrapper[4948]: W1204 17:28:34.539286 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod78ea971c_3d85_48c2_8eed_04157dfa2f78.slice/crio-7b3eca0953ee652440d3c6b74f5eb678df49ff02976f5b4e35b2eb65ce5804dd WatchSource:0}: Error finding container 7b3eca0953ee652440d3c6b74f5eb678df49ff02976f5b4e35b2eb65ce5804dd: Status 404 returned error can't find the container with id 7b3eca0953ee652440d3c6b74f5eb678df49ff02976f5b4e35b2eb65ce5804dd Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.787950 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"4776390c-6250-4b5c-ac19-f644150131ac","Type":"ContainerStarted","Data":"d42e46fda262282e80ffa39f13547b75d4ff786883399f6b1e843196a5fe46d8"} Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.789711 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-h58bm_39ae03b6-0da8-43f7-84d2-300f5d0648af/kube-multus-additional-cni-plugins/0.log" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.789805 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.790291 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-h58bm" event={"ID":"39ae03b6-0da8-43f7-84d2-300f5d0648af","Type":"ContainerDied","Data":"1de308fd291145d205ca4cb5764822e4cf9e34c6ceef969ebcec255d629ef42a"} Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.790380 4948 scope.go:117] "RemoveContainer" containerID="92f643ab1a15bf28a8a648f8a11cfec2454da685ad694703a6e0ca569ee8874b" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.794103 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78ea971c-3d85-48c2-8eed-04157dfa2f78","Type":"ContainerStarted","Data":"7b3eca0953ee652440d3c6b74f5eb678df49ff02976f5b4e35b2eb65ce5804dd"} Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.796670 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-m8dlm" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.797069 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qw297" podUID="767a0495-90ff-412b-87da-a788808cda0e" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.797569 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mzq82" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.797700 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-77jch" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" Dec 04 17:28:34 crc kubenswrapper[4948]: E1204 17:28:34.797942 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-l48pp" podUID="cc8a9450-7e86-4194-962d-566fee4563df" Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.859711 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-h58bm"] Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.863921 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-h58bm"] Dec 04 17:28:34 crc kubenswrapper[4948]: I1204 17:28:34.920915 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" path="/var/lib/kubelet/pods/39ae03b6-0da8-43f7-84d2-300f5d0648af/volumes" Dec 04 17:28:35 crc 
kubenswrapper[4948]: I1204 17:28:35.804961 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78ea971c-3d85-48c2-8eed-04157dfa2f78","Type":"ContainerStarted","Data":"0538caba277a0f79e392e4bc6ccafb9dcdddde8ff5e48cb344a2c2302f65f557"} Dec 04 17:28:35 crc kubenswrapper[4948]: I1204 17:28:35.808009 4948 generic.go:334] "Generic (PLEG): container finished" podID="4776390c-6250-4b5c-ac19-f644150131ac" containerID="75713fc628dc7135e77e5377705902c6e70005eb3a0a55c585f0e86e206ab8b1" exitCode=0 Dec 04 17:28:35 crc kubenswrapper[4948]: I1204 17:28:35.808124 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4776390c-6250-4b5c-ac19-f644150131ac","Type":"ContainerDied","Data":"75713fc628dc7135e77e5377705902c6e70005eb3a0a55c585f0e86e206ab8b1"} Dec 04 17:28:35 crc kubenswrapper[4948]: I1204 17:28:35.830321 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=21.830289185 podStartE2EDuration="21.830289185s" podCreationTimestamp="2025-12-04 17:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:28:35.82099357 +0000 UTC m=+127.182068002" watchObservedRunningTime="2025-12-04 17:28:35.830289185 +0000 UTC m=+127.191363607" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.064990 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.207232 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4776390c-6250-4b5c-ac19-f644150131ac-kube-api-access\") pod \"4776390c-6250-4b5c-ac19-f644150131ac\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.207395 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4776390c-6250-4b5c-ac19-f644150131ac-kubelet-dir\") pod \"4776390c-6250-4b5c-ac19-f644150131ac\" (UID: \"4776390c-6250-4b5c-ac19-f644150131ac\") " Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.207534 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4776390c-6250-4b5c-ac19-f644150131ac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4776390c-6250-4b5c-ac19-f644150131ac" (UID: "4776390c-6250-4b5c-ac19-f644150131ac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.207705 4948 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4776390c-6250-4b5c-ac19-f644150131ac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.211814 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4776390c-6250-4b5c-ac19-f644150131ac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4776390c-6250-4b5c-ac19-f644150131ac" (UID: "4776390c-6250-4b5c-ac19-f644150131ac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.309780 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4776390c-6250-4b5c-ac19-f644150131ac-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.825986 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4776390c-6250-4b5c-ac19-f644150131ac","Type":"ContainerDied","Data":"d42e46fda262282e80ffa39f13547b75d4ff786883399f6b1e843196a5fe46d8"} Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.826024 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42e46fda262282e80ffa39f13547b75d4ff786883399f6b1e843196a5fe46d8" Dec 04 17:28:37 crc kubenswrapper[4948]: I1204 17:28:37.826072 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 17:28:38 crc kubenswrapper[4948]: I1204 17:28:38.832647 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerStarted","Data":"7c9af524c5e1c8fa11f5ffdd2ef4c79cdca7c451cfac807c4120142e29fbcf3b"} Dec 04 17:28:39 crc kubenswrapper[4948]: I1204 17:28:39.843270 4948 generic.go:334] "Generic (PLEG): container finished" podID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerID="7c9af524c5e1c8fa11f5ffdd2ef4c79cdca7c451cfac807c4120142e29fbcf3b" exitCode=0 Dec 04 17:28:39 crc kubenswrapper[4948]: I1204 17:28:39.843370 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerDied","Data":"7c9af524c5e1c8fa11f5ffdd2ef4c79cdca7c451cfac807c4120142e29fbcf3b"} Dec 04 17:28:40 crc kubenswrapper[4948]: 
I1204 17:28:40.851290 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerStarted","Data":"020a786a1089dba185e75e7c34a45004df606e4f5d17984c44d17509d31c69fd"} Dec 04 17:28:40 crc kubenswrapper[4948]: I1204 17:28:40.874076 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gfq54" podStartSLOduration=2.846362112 podStartE2EDuration="1m10.874054376s" podCreationTimestamp="2025-12-04 17:27:30 +0000 UTC" firstStartedPulling="2025-12-04 17:27:32.238572544 +0000 UTC m=+63.599646946" lastFinishedPulling="2025-12-04 17:28:40.266264808 +0000 UTC m=+131.627339210" observedRunningTime="2025-12-04 17:28:40.873942533 +0000 UTC m=+132.235016935" watchObservedRunningTime="2025-12-04 17:28:40.874054376 +0000 UTC m=+132.235128778" Dec 04 17:28:41 crc kubenswrapper[4948]: I1204 17:28:41.081031 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:28:41 crc kubenswrapper[4948]: I1204 17:28:41.081101 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:28:42 crc kubenswrapper[4948]: I1204 17:28:42.134572 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gfq54" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="registry-server" probeResult="failure" output=< Dec 04 17:28:42 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 17:28:42 crc kubenswrapper[4948]: > Dec 04 17:28:48 crc kubenswrapper[4948]: I1204 17:28:48.890238 4948 generic.go:334] "Generic (PLEG): container finished" podID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerID="07a5a4fc53f48cb9b534fc2cd8c2f6c124fd43215d54cc5378f696a355e6ba80" exitCode=0 Dec 04 17:28:48 crc kubenswrapper[4948]: 
I1204 17:28:48.890304 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77jch" event={"ID":"18aaaacf-fb8c-4ba8-ab03-b89ec705114b","Type":"ContainerDied","Data":"07a5a4fc53f48cb9b534fc2cd8c2f6c124fd43215d54cc5378f696a355e6ba80"} Dec 04 17:28:48 crc kubenswrapper[4948]: I1204 17:28:48.893414 4948 generic.go:334] "Generic (PLEG): container finished" podID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerID="a1fa82178f6251efd1dd2b373547449bc54f872161e6de9a2766188efb2fbe7e" exitCode=0 Dec 04 17:28:48 crc kubenswrapper[4948]: I1204 17:28:48.893464 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzq82" event={"ID":"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea","Type":"ContainerDied","Data":"a1fa82178f6251efd1dd2b373547449bc54f872161e6de9a2766188efb2fbe7e"} Dec 04 17:28:51 crc kubenswrapper[4948]: I1204 17:28:51.128108 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:28:51 crc kubenswrapper[4948]: I1204 17:28:51.172158 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.142510 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gfq54"] Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.143006 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gfq54" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="registry-server" containerID="cri-o://020a786a1089dba185e75e7c34a45004df606e4f5d17984c44d17509d31c69fd" gracePeriod=2 Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.924450 4948 generic.go:334] "Generic (PLEG): container finished" podID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" 
containerID="9fb8cf56fea2f0a03dd3001535ece2773c621188fdb5c8ac5bbd4f0665e4adff" exitCode=0 Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.924511 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerDied","Data":"9fb8cf56fea2f0a03dd3001535ece2773c621188fdb5c8ac5bbd4f0665e4adff"} Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.926654 4948 generic.go:334] "Generic (PLEG): container finished" podID="fc2914f1-50b7-4a3a-902e-000091874005" containerID="9c897f54758cd272f87bb5aebee8cf08f477e45354f3232262543de51f30246d" exitCode=0 Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.926700 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jw8ps" event={"ID":"fc2914f1-50b7-4a3a-902e-000091874005","Type":"ContainerDied","Data":"9c897f54758cd272f87bb5aebee8cf08f477e45354f3232262543de51f30246d"} Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.939769 4948 generic.go:334] "Generic (PLEG): container finished" podID="767a0495-90ff-412b-87da-a788808cda0e" containerID="9fb06d76bb26ee7cde43b223783f56093240e02fbc10e589573a903040d18c79" exitCode=0 Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.939976 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerDied","Data":"9fb06d76bb26ee7cde43b223783f56093240e02fbc10e589573a903040d18c79"} Dec 04 17:28:54 crc kubenswrapper[4948]: I1204 17:28:54.946899 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerStarted","Data":"734fdda00dc33c05fbc5c04eced0697d9bdaf0b4131ebc66140dd75e12913045"} Dec 04 17:28:55 crc kubenswrapper[4948]: I1204 17:28:55.956287 4948 generic.go:334] "Generic (PLEG): container 
finished" podID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerID="020a786a1089dba185e75e7c34a45004df606e4f5d17984c44d17509d31c69fd" exitCode=0 Dec 04 17:28:55 crc kubenswrapper[4948]: I1204 17:28:55.956360 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerDied","Data":"020a786a1089dba185e75e7c34a45004df606e4f5d17984c44d17509d31c69fd"} Dec 04 17:28:55 crc kubenswrapper[4948]: I1204 17:28:55.958005 4948 generic.go:334] "Generic (PLEG): container finished" podID="cc8a9450-7e86-4194-962d-566fee4563df" containerID="734fdda00dc33c05fbc5c04eced0697d9bdaf0b4131ebc66140dd75e12913045" exitCode=0 Dec 04 17:28:55 crc kubenswrapper[4948]: I1204 17:28:55.958098 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerDied","Data":"734fdda00dc33c05fbc5c04eced0697d9bdaf0b4131ebc66140dd75e12913045"} Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.674805 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.852160 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-utilities\") pod \"e260d86e-160c-4d14-896c-bcc2b35d2f90\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.852220 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzqrj\" (UniqueName: \"kubernetes.io/projected/e260d86e-160c-4d14-896c-bcc2b35d2f90-kube-api-access-pzqrj\") pod \"e260d86e-160c-4d14-896c-bcc2b35d2f90\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.852243 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-catalog-content\") pod \"e260d86e-160c-4d14-896c-bcc2b35d2f90\" (UID: \"e260d86e-160c-4d14-896c-bcc2b35d2f90\") " Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.854162 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-utilities" (OuterVolumeSpecName: "utilities") pod "e260d86e-160c-4d14-896c-bcc2b35d2f90" (UID: "e260d86e-160c-4d14-896c-bcc2b35d2f90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.858573 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e260d86e-160c-4d14-896c-bcc2b35d2f90-kube-api-access-pzqrj" (OuterVolumeSpecName: "kube-api-access-pzqrj") pod "e260d86e-160c-4d14-896c-bcc2b35d2f90" (UID: "e260d86e-160c-4d14-896c-bcc2b35d2f90"). InnerVolumeSpecName "kube-api-access-pzqrj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.953722 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.954029 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzqrj\" (UniqueName: \"kubernetes.io/projected/e260d86e-160c-4d14-896c-bcc2b35d2f90-kube-api-access-pzqrj\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.964481 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerStarted","Data":"c4aef8e3ca34ea978c9519d44d7fa2af6bc13a5f8e4ce8a1be45752597527294"} Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.966183 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfq54" event={"ID":"e260d86e-160c-4d14-896c-bcc2b35d2f90","Type":"ContainerDied","Data":"6dd1dacef56ba1fd2710c82b3fd90b22fe61e4774e46c3899cc5840744082708"} Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.966225 4948 scope.go:117] "RemoveContainer" containerID="020a786a1089dba185e75e7c34a45004df606e4f5d17984c44d17509d31c69fd" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.966227 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfq54" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.968891 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzq82" event={"ID":"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea","Type":"ContainerStarted","Data":"5f7bac73c2ee577e084fc4725cfe50810e76989162f83375d0f8d811439e65cd"} Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.971794 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerStarted","Data":"59a5193f673df9d58cd33263d2961191c376a81cfd68198984de1c09688321f0"} Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.975976 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77jch" event={"ID":"18aaaacf-fb8c-4ba8-ab03-b89ec705114b","Type":"ContainerStarted","Data":"77d8a8f65056279c810b1a6c249fc35052efd8d27e3c16327f9fc6e70ecab2f6"} Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.978219 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e260d86e-160c-4d14-896c-bcc2b35d2f90" (UID: "e260d86e-160c-4d14-896c-bcc2b35d2f90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:28:56 crc kubenswrapper[4948]: I1204 17:28:56.987262 4948 scope.go:117] "RemoveContainer" containerID="7c9af524c5e1c8fa11f5ffdd2ef4c79cdca7c451cfac807c4120142e29fbcf3b" Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.004663 4948 scope.go:117] "RemoveContainer" containerID="71e2e8a4b900e28eb00c0c94979d51e66fba196935d43792f6fbaa1e83ff13bc" Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.055284 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e260d86e-160c-4d14-896c-bcc2b35d2f90-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.293338 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gfq54"] Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.295809 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gfq54"] Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.983983 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerStarted","Data":"ae16fa2c605d20dd73cc12faf0c919c48f07d7430c67acb9feb5c7674881d65a"} Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.987095 4948 generic.go:334] "Generic (PLEG): container finished" podID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerID="59a5193f673df9d58cd33263d2961191c376a81cfd68198984de1c09688321f0" exitCode=0 Dec 04 17:28:57 crc kubenswrapper[4948]: I1204 17:28:57.987156 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerDied","Data":"59a5193f673df9d58cd33263d2961191c376a81cfd68198984de1c09688321f0"} Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.003497 4948 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l48pp" podStartSLOduration=2.9764073140000002 podStartE2EDuration="1m28.003479829s" podCreationTimestamp="2025-12-04 17:27:30 +0000 UTC" firstStartedPulling="2025-12-04 17:27:32.226920032 +0000 UTC m=+63.587994434" lastFinishedPulling="2025-12-04 17:28:57.253992547 +0000 UTC m=+148.615066949" observedRunningTime="2025-12-04 17:28:58.00324019 +0000 UTC m=+149.364314602" watchObservedRunningTime="2025-12-04 17:28:58.003479829 +0000 UTC m=+149.364554231" Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.033381 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77jch" podStartSLOduration=4.663183302 podStartE2EDuration="1m31.033356461s" podCreationTimestamp="2025-12-04 17:27:27 +0000 UTC" firstStartedPulling="2025-12-04 17:27:30.068257156 +0000 UTC m=+61.429331558" lastFinishedPulling="2025-12-04 17:28:56.438430315 +0000 UTC m=+147.799504717" observedRunningTime="2025-12-04 17:28:58.032210206 +0000 UTC m=+149.393284608" watchObservedRunningTime="2025-12-04 17:28:58.033356461 +0000 UTC m=+149.394430873" Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.082137 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qw297" podStartSLOduration=5.702865429 podStartE2EDuration="1m31.082115908s" podCreationTimestamp="2025-12-04 17:27:27 +0000 UTC" firstStartedPulling="2025-12-04 17:27:31.174908757 +0000 UTC m=+62.535983159" lastFinishedPulling="2025-12-04 17:28:56.554159226 +0000 UTC m=+147.915233638" observedRunningTime="2025-12-04 17:28:58.058189787 +0000 UTC m=+149.419264209" watchObservedRunningTime="2025-12-04 17:28:58.082115908 +0000 UTC m=+149.443190320" Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.096333 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-mzq82" podStartSLOduration=3.784403685 podStartE2EDuration="1m28.09631346s" podCreationTimestamp="2025-12-04 17:27:30 +0000 UTC" firstStartedPulling="2025-12-04 17:27:32.208235688 +0000 UTC m=+63.569310090" lastFinishedPulling="2025-12-04 17:28:56.520145463 +0000 UTC m=+147.881219865" observedRunningTime="2025-12-04 17:28:58.093922797 +0000 UTC m=+149.454997199" watchObservedRunningTime="2025-12-04 17:28:58.09631346 +0000 UTC m=+149.457387862" Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.239501 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.239560 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:28:58 crc kubenswrapper[4948]: I1204 17:28:58.921064 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" path="/var/lib/kubelet/pods/e260d86e-160c-4d14-896c-bcc2b35d2f90/volumes" Dec 04 17:28:59 crc kubenswrapper[4948]: I1204 17:28:59.000842 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerStarted","Data":"c3eaa4af95baccf8f31eb1654650c547d663fbdc728faf99613a4463715487d7"} Dec 04 17:28:59 crc kubenswrapper[4948]: I1204 17:28:59.004086 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jw8ps" event={"ID":"fc2914f1-50b7-4a3a-902e-000091874005","Type":"ContainerStarted","Data":"a898f9fce9fab0e97ceda44d90f84a5d3d5abc7ea3d7144b9d59aabfa934b25d"} Dec 04 17:28:59 crc kubenswrapper[4948]: I1204 17:28:59.020677 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-22gwb" podStartSLOduration=5.077216493 
podStartE2EDuration="1m32.020655653s" podCreationTimestamp="2025-12-04 17:27:27 +0000 UTC" firstStartedPulling="2025-12-04 17:27:31.179573778 +0000 UTC m=+62.540648180" lastFinishedPulling="2025-12-04 17:28:58.123012938 +0000 UTC m=+149.484087340" observedRunningTime="2025-12-04 17:28:59.017850254 +0000 UTC m=+150.378924656" watchObservedRunningTime="2025-12-04 17:28:59.020655653 +0000 UTC m=+150.381730055" Dec 04 17:28:59 crc kubenswrapper[4948]: I1204 17:28:59.041003 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jw8ps" podStartSLOduration=4.655095777 podStartE2EDuration="1m30.040983164s" podCreationTimestamp="2025-12-04 17:27:29 +0000 UTC" firstStartedPulling="2025-12-04 17:27:32.247364971 +0000 UTC m=+63.608439373" lastFinishedPulling="2025-12-04 17:28:57.633252358 +0000 UTC m=+148.994326760" observedRunningTime="2025-12-04 17:28:59.037732168 +0000 UTC m=+150.398806590" watchObservedRunningTime="2025-12-04 17:28:59.040983164 +0000 UTC m=+150.402057566" Dec 04 17:28:59 crc kubenswrapper[4948]: I1204 17:28:59.277627 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qw297" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="registry-server" probeResult="failure" output=< Dec 04 17:28:59 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 17:28:59 crc kubenswrapper[4948]: > Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.010906 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerStarted","Data":"3773e01b0f6c8847e8deef225de284322dbf71150a847e6d9f9d8190469e06cb"} Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.032471 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8dlm" 
podStartSLOduration=5.180250296 podStartE2EDuration="1m33.032454059s" podCreationTimestamp="2025-12-04 17:27:27 +0000 UTC" firstStartedPulling="2025-12-04 17:27:31.095285175 +0000 UTC m=+62.456359577" lastFinishedPulling="2025-12-04 17:28:58.947488938 +0000 UTC m=+150.308563340" observedRunningTime="2025-12-04 17:29:00.031177169 +0000 UTC m=+151.392251641" watchObservedRunningTime="2025-12-04 17:29:00.032454059 +0000 UTC m=+151.393528461" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.079460 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.079519 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.143355 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.471546 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.471853 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.533200 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.900172 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:29:00 crc kubenswrapper[4948]: I1204 17:29:00.900210 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:29:01 crc kubenswrapper[4948]: I1204 
17:29:01.063501 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:29:01 crc kubenswrapper[4948]: I1204 17:29:01.942000 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l48pp" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="registry-server" probeResult="failure" output=< Dec 04 17:29:01 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 17:29:01 crc kubenswrapper[4948]: > Dec 04 17:29:02 crc kubenswrapper[4948]: I1204 17:29:02.343679 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzq82"] Dec 04 17:29:04 crc kubenswrapper[4948]: I1204 17:29:04.029455 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mzq82" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="registry-server" containerID="cri-o://5f7bac73c2ee577e084fc4725cfe50810e76989162f83375d0f8d811439e65cd" gracePeriod=2 Dec 04 17:29:07 crc kubenswrapper[4948]: I1204 17:29:07.748381 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:29:07 crc kubenswrapper[4948]: I1204 17:29:07.748812 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:29:07 crc kubenswrapper[4948]: I1204 17:29:07.808618 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:29:07 crc kubenswrapper[4948]: I1204 17:29:07.922645 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:29:07 crc kubenswrapper[4948]: I1204 17:29:07.922747 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:29:07 crc kubenswrapper[4948]: I1204 17:29:07.967889 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:29:08 crc kubenswrapper[4948]: I1204 17:29:08.281491 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:29:08 crc kubenswrapper[4948]: I1204 17:29:08.317388 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:29:08 crc kubenswrapper[4948]: I1204 17:29:08.338494 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:29:08 crc kubenswrapper[4948]: I1204 17:29:08.338566 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:29:08 crc kubenswrapper[4948]: I1204 17:29:08.484101 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.166382 4948 generic.go:334] "Generic (PLEG): container finished" podID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerID="5f7bac73c2ee577e084fc4725cfe50810e76989162f83375d0f8d811439e65cd" exitCode=0 Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.166471 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzq82" event={"ID":"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea","Type":"ContainerDied","Data":"5f7bac73c2ee577e084fc4725cfe50810e76989162f83375d0f8d811439e65cd"} Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.208844 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:29:09 crc 
kubenswrapper[4948]: I1204 17:29:09.208894 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.270264 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.418153 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-catalog-content\") pod \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.418251 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-utilities\") pod \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.418299 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vcm\" (UniqueName: \"kubernetes.io/projected/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-kube-api-access-x7vcm\") pod \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\" (UID: \"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea\") " Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.419011 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-utilities" (OuterVolumeSpecName: "utilities") pod "ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" (UID: "ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.423832 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-kube-api-access-x7vcm" (OuterVolumeSpecName: "kube-api-access-x7vcm") pod "ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" (UID: "ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea"). InnerVolumeSpecName "kube-api-access-x7vcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.438780 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" (UID: "ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.452237 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qw297"] Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.520179 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.520221 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 17:29:09.520235 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vcm\" (UniqueName: \"kubernetes.io/projected/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea-kube-api-access-x7vcm\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:09 crc kubenswrapper[4948]: I1204 
17:29:09.552950 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2tl2h"] Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.121335 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.175711 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzq82" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.175700 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzq82" event={"ID":"ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea","Type":"ContainerDied","Data":"aafb7240c8b80b274814b0a4bb515c607c6ced0a249519c05074a5d1cefe9986"} Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.175765 4948 scope.go:117] "RemoveContainer" containerID="5f7bac73c2ee577e084fc4725cfe50810e76989162f83375d0f8d811439e65cd" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.205684 4948 scope.go:117] "RemoveContainer" containerID="a1fa82178f6251efd1dd2b373547449bc54f872161e6de9a2766188efb2fbe7e" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.218544 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzq82"] Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.219016 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.222799 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzq82"] Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.227950 4948 scope.go:117] "RemoveContainer" containerID="eb23f9cac75a9ba791028371b7edb1e872659deb2ff29eff5b35ec1df2a63456" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.625176 4948 
patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.625638 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.852336 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8dlm"] Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.921155 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" path="/var/lib/kubelet/pods/ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea/volumes" Dec 04 17:29:10 crc kubenswrapper[4948]: I1204 17:29:10.948219 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:29:11 crc kubenswrapper[4948]: I1204 17:29:11.010380 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:29:11 crc kubenswrapper[4948]: I1204 17:29:11.188842 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qw297" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="registry-server" containerID="cri-o://c4aef8e3ca34ea978c9519d44d7fa2af6bc13a5f8e4ce8a1be45752597527294" gracePeriod=2 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.192888 4948 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-m8dlm" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="registry-server" containerID="cri-o://3773e01b0f6c8847e8deef225de284322dbf71150a847e6d9f9d8190469e06cb" gracePeriod=2 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499443 4948 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499671 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="extract-utilities" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499683 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="extract-utilities" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499695 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="extract-content" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499701 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="extract-content" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499712 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499718 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499729 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="extract-content" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499736 4948 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="extract-content" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499747 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="registry-server" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499754 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="registry-server" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499762 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="extract-utilities" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499768 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="extract-utilities" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499799 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="registry-server" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499807 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="registry-server" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.499821 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4776390c-6250-4b5c-ac19-f644150131ac" containerName="pruner" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499827 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4776390c-6250-4b5c-ac19-f644150131ac" containerName="pruner" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499914 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ae03b6-0da8-43f7-84d2-300f5d0648af" containerName="kube-multus-additional-cni-plugins" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499924 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad77bb80-fd4a-4f6d-ac4d-d7a3e7c61aea" containerName="registry-server" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499934 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4776390c-6250-4b5c-ac19-f644150131ac" containerName="pruner" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.499946 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e260d86e-160c-4d14-896c-bcc2b35d2f90" containerName="registry-server" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500257 4948 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500481 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b" gracePeriod=15 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500523 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500532 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7" gracePeriod=15 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500592 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe" gracePeriod=15 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500643 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d" gracePeriod=15 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.500679 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5" gracePeriod=15 Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.502572 4948 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.502838 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 17:29:12 crc 
kubenswrapper[4948]: I1204 17:29:12.502850 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.502857 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.502863 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.502874 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.502880 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.502891 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.502897 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.502906 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.502912 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 17:29:12 crc kubenswrapper[4948]: E1204 17:29:12.502923 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.502929 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.503020 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.503030 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.503058 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.503067 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.503077 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.529275 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.660962 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.661029 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.661120 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.661167 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.661190 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.661210 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc 
kubenswrapper[4948]: I1204 17:29:12.661232 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.661254 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762295 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762584 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762624 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc 
kubenswrapper[4948]: I1204 17:29:12.762664 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762710 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762733 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762754 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762775 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762840 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762421 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762897 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762917 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762937 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762967 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.762985 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.763006 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: I1204 17:29:12.826666 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:29:12 crc kubenswrapper[4948]: W1204 17:29:12.848284 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b1b801d9788774935baa52aab9867c013b638f2dd71e4b253ecba7bd97e36d67 WatchSource:0}: Error finding container b1b801d9788774935baa52aab9867c013b638f2dd71e4b253ecba7bd97e36d67: Status 404 returned error can't find the container with id b1b801d9788774935baa52aab9867c013b638f2dd71e4b253ecba7bd97e36d67 Dec 04 17:29:13 crc kubenswrapper[4948]: I1204 17:29:13.200448 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b1b801d9788774935baa52aab9867c013b638f2dd71e4b253ecba7bd97e36d67"} Dec 04 17:29:14 crc kubenswrapper[4948]: E1204 17:29:14.194015 4948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e1353b392f780 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 17:29:14.191779712 +0000 UTC m=+165.552854114,LastTimestamp:2025-12-04 17:29:14.191779712 +0000 UTC m=+165.552854114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.210641 4948 generic.go:334] "Generic (PLEG): container finished" podID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerID="3773e01b0f6c8847e8deef225de284322dbf71150a847e6d9f9d8190469e06cb" exitCode=0 Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.210720 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerDied","Data":"3773e01b0f6c8847e8deef225de284322dbf71150a847e6d9f9d8190469e06cb"} Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.212407 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de"} Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.212973 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.215154 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.215770 4948 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7" exitCode=0 Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.215790 4948 generic.go:334] "Generic 
(PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe" exitCode=0 Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.215798 4948 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d" exitCode=0 Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.215806 4948 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5" exitCode=2 Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.217624 4948 generic.go:334] "Generic (PLEG): container finished" podID="767a0495-90ff-412b-87da-a788808cda0e" containerID="c4aef8e3ca34ea978c9519d44d7fa2af6bc13a5f8e4ce8a1be45752597527294" exitCode=0 Dec 04 17:29:14 crc kubenswrapper[4948]: I1204 17:29:14.217699 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerDied","Data":"c4aef8e3ca34ea978c9519d44d7fa2af6bc13a5f8e4ce8a1be45752597527294"} Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.010027 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.011685 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.012139 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.201138 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-utilities\") pod \"74848112-8c60-4bcf-9f90-caee5c6e7f17\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.201197 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg76f\" (UniqueName: \"kubernetes.io/projected/74848112-8c60-4bcf-9f90-caee5c6e7f17-kube-api-access-gg76f\") pod \"74848112-8c60-4bcf-9f90-caee5c6e7f17\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.201331 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-catalog-content\") pod \"74848112-8c60-4bcf-9f90-caee5c6e7f17\" (UID: \"74848112-8c60-4bcf-9f90-caee5c6e7f17\") " Dec 04 17:29:15 crc 
kubenswrapper[4948]: I1204 17:29:15.202004 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-utilities" (OuterVolumeSpecName: "utilities") pod "74848112-8c60-4bcf-9f90-caee5c6e7f17" (UID: "74848112-8c60-4bcf-9f90-caee5c6e7f17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.207337 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74848112-8c60-4bcf-9f90-caee5c6e7f17-kube-api-access-gg76f" (OuterVolumeSpecName: "kube-api-access-gg76f") pod "74848112-8c60-4bcf-9f90-caee5c6e7f17" (UID: "74848112-8c60-4bcf-9f90-caee5c6e7f17"). InnerVolumeSpecName "kube-api-access-gg76f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.228634 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8dlm" event={"ID":"74848112-8c60-4bcf-9f90-caee5c6e7f17","Type":"ContainerDied","Data":"6cc9bed0900747e30994511253d755d608e2df2d304b41d3fe122bf34b835678"} Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.228701 4948 scope.go:117] "RemoveContainer" containerID="3773e01b0f6c8847e8deef225de284322dbf71150a847e6d9f9d8190469e06cb" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.228880 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8dlm" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.230204 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.230706 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.233267 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78ea971c-3d85-48c2-8eed-04157dfa2f78","Type":"ContainerDied","Data":"0538caba277a0f79e392e4bc6ccafb9dcdddde8ff5e48cb344a2c2302f65f557"} Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.233945 4948 generic.go:334] "Generic (PLEG): container finished" podID="78ea971c-3d85-48c2-8eed-04157dfa2f78" containerID="0538caba277a0f79e392e4bc6ccafb9dcdddde8ff5e48cb344a2c2302f65f557" exitCode=0 Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.235180 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.235614 4948 status_manager.go:851] "Failed to get status for pod" 
podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.236002 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.250524 4948 scope.go:117] "RemoveContainer" containerID="59a5193f673df9d58cd33263d2961191c376a81cfd68198984de1c09688321f0" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.258950 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74848112-8c60-4bcf-9f90-caee5c6e7f17" (UID: "74848112-8c60-4bcf-9f90-caee5c6e7f17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.285197 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.285789 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.286153 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.286492 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.286803 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.286863 4948 scope.go:117] "RemoveContainer" containerID="56ad6fed6cfbb9103ab113b1217f6f37da42e53ec392b2a82155112d1ae3fb3f" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.303198 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.303250 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg76f\" (UniqueName: \"kubernetes.io/projected/74848112-8c60-4bcf-9f90-caee5c6e7f17-kube-api-access-gg76f\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.303262 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74848112-8c60-4bcf-9f90-caee5c6e7f17-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.404578 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-utilities\") pod \"767a0495-90ff-412b-87da-a788808cda0e\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.404673 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrndn\" (UniqueName: \"kubernetes.io/projected/767a0495-90ff-412b-87da-a788808cda0e-kube-api-access-mrndn\") pod \"767a0495-90ff-412b-87da-a788808cda0e\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.404774 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-catalog-content\") pod \"767a0495-90ff-412b-87da-a788808cda0e\" (UID: \"767a0495-90ff-412b-87da-a788808cda0e\") " Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.406368 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-utilities" (OuterVolumeSpecName: "utilities") pod 
"767a0495-90ff-412b-87da-a788808cda0e" (UID: "767a0495-90ff-412b-87da-a788808cda0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.411697 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767a0495-90ff-412b-87da-a788808cda0e-kube-api-access-mrndn" (OuterVolumeSpecName: "kube-api-access-mrndn") pod "767a0495-90ff-412b-87da-a788808cda0e" (UID: "767a0495-90ff-412b-87da-a788808cda0e"). InnerVolumeSpecName "kube-api-access-mrndn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.463365 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "767a0495-90ff-412b-87da-a788808cda0e" (UID: "767a0495-90ff-412b-87da-a788808cda0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.506479 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.506522 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrndn\" (UniqueName: \"kubernetes.io/projected/767a0495-90ff-412b-87da-a788808cda0e-kube-api-access-mrndn\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.506558 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767a0495-90ff-412b-87da-a788808cda0e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.545558 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.545915 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.546180 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:15 crc kubenswrapper[4948]: I1204 17:29:15.546426 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: E1204 17:29:16.073554 4948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e1353b392f780 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 17:29:14.191779712 +0000 UTC m=+165.552854114,LastTimestamp:2025-12-04 17:29:14.191779712 +0000 UTC m=+165.552854114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.243849 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qw297" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.243856 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qw297" event={"ID":"767a0495-90ff-412b-87da-a788808cda0e","Type":"ContainerDied","Data":"8c302319bd6d9b860dd4b8b93b849298c31dbb0f5b394914c579c9ee9e00827c"} Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.243924 4948 scope.go:117] "RemoveContainer" containerID="c4aef8e3ca34ea978c9519d44d7fa2af6bc13a5f8e4ce8a1be45752597527294" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.244910 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.245446 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.245866 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.246331 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" 
pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.248926 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.249754 4948 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b" exitCode=0 Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.264088 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.264526 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.264912 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.265511 4948 status_manager.go:851] "Failed 
to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.274102 4948 scope.go:117] "RemoveContainer" containerID="9fb06d76bb26ee7cde43b223783f56093240e02fbc10e589573a903040d18c79" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.324054 4948 scope.go:117] "RemoveContainer" containerID="895288ee0f92b1776d4a032c4e6426f82738e10698ed9ac7b23de6081f6d8f1a" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.520950 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.522020 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.522528 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.522975 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.523271 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.703187 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.704977 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.705623 4948 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.706120 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.706563 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" 
pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.706868 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.707204 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.723571 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-kubelet-dir\") pod \"78ea971c-3d85-48c2-8eed-04157dfa2f78\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.723673 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78ea971c-3d85-48c2-8eed-04157dfa2f78-kube-api-access\") pod \"78ea971c-3d85-48c2-8eed-04157dfa2f78\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.723713 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"78ea971c-3d85-48c2-8eed-04157dfa2f78" (UID: "78ea971c-3d85-48c2-8eed-04157dfa2f78"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.723790 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-var-lock\") pod \"78ea971c-3d85-48c2-8eed-04157dfa2f78\" (UID: \"78ea971c-3d85-48c2-8eed-04157dfa2f78\") " Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.723919 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-var-lock" (OuterVolumeSpecName: "var-lock") pod "78ea971c-3d85-48c2-8eed-04157dfa2f78" (UID: "78ea971c-3d85-48c2-8eed-04157dfa2f78"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.724129 4948 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.724150 4948 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78ea971c-3d85-48c2-8eed-04157dfa2f78-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.731601 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ea971c-3d85-48c2-8eed-04157dfa2f78-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78ea971c-3d85-48c2-8eed-04157dfa2f78" (UID: "78ea971c-3d85-48c2-8eed-04157dfa2f78"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.824912 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.824963 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825010 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825132 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825129 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825217 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825519 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78ea971c-3d85-48c2-8eed-04157dfa2f78-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825538 4948 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825549 4948 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.825561 4948 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:16 crc kubenswrapper[4948]: I1204 17:29:16.921106 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.258292 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.258321 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"78ea971c-3d85-48c2-8eed-04157dfa2f78","Type":"ContainerDied","Data":"7b3eca0953ee652440d3c6b74f5eb678df49ff02976f5b4e35b2eb65ce5804dd"} Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.258508 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b3eca0953ee652440d3c6b74f5eb678df49ff02976f5b4e35b2eb65ce5804dd" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.261861 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.262097 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.262312 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.262636 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" 
pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.263299 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.265676 4948 scope.go:117] "RemoveContainer" containerID="310a6d1c7f66eb35bdde700b336d949e56614183d3db9d643a2574f651d54fd7" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.265749 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.266835 4948 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.267201 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.267929 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 
38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.268218 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.268438 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.268730 4948 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.268982 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.269287 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 
38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.269522 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.269744 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.280782 4948 scope.go:117] "RemoveContainer" containerID="085a56c6ec0da1223fbe699814093dc45f885695d8465311b3c2b71d177f0efe" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.294906 4948 scope.go:117] "RemoveContainer" containerID="909d3850374349d0caac03fd577f2f77ef9761161ce1846784d94ee3b4cbe38d" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.307090 4948 scope.go:117] "RemoveContainer" containerID="28591bb48ddfdc03afd4cced23304fb9e68b8333829f8408e7a9bea8e7ff13a5" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.321426 4948 scope.go:117] "RemoveContainer" containerID="3ccb318d06035bf63eae9101f9ed3d5bd04e033c0e945421f4173b48d35f254b" Dec 04 17:29:17 crc kubenswrapper[4948]: I1204 17:29:17.335743 4948 scope.go:117] "RemoveContainer" containerID="5a132a8e519cf9b115453ffd2a5e57e6702659bbb7f7eb46a99d6f7936a6115d" Dec 04 17:29:18 crc kubenswrapper[4948]: I1204 17:29:18.916132 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:18 crc kubenswrapper[4948]: I1204 17:29:18.916863 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:18 crc kubenswrapper[4948]: I1204 17:29:18.917155 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:18 crc kubenswrapper[4948]: I1204 17:29:18.917474 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:18 crc kubenswrapper[4948]: I1204 17:29:18.917636 4948 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.052886 4948 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.212:6443: connect: connection refused" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.053368 4948 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.053665 4948 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.053896 4948 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.054248 4948 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:19 crc kubenswrapper[4948]: I1204 17:29:19.054296 4948 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.054663 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="200ms" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.255588 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="400ms" Dec 04 17:29:19 crc kubenswrapper[4948]: E1204 17:29:19.656534 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="800ms" Dec 04 17:29:20 crc kubenswrapper[4948]: E1204 17:29:20.457597 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="1.6s" Dec 04 17:29:22 crc kubenswrapper[4948]: E1204 17:29:22.058555 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="3.2s" Dec 04 17:29:25 crc kubenswrapper[4948]: E1204 17:29:25.259544 4948 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.212:6443: connect: connection refused" interval="6.4s" Dec 04 17:29:26 crc kubenswrapper[4948]: E1204 17:29:26.075130 4948 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.212:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e1353b392f780 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Created,Message:Created container startup-monitor,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 17:29:14.191779712 +0000 UTC m=+165.552854114,LastTimestamp:2025-12-04 17:29:14.191779712 +0000 UTC m=+165.552854114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.913785 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.914821 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.915346 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.915684 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 
17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.915985 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.929109 4948 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.929144 4948 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:27 crc kubenswrapper[4948]: E1204 17:29:27.929639 4948 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:27 crc kubenswrapper[4948]: I1204 17:29:27.930366 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:27 crc kubenswrapper[4948]: W1204 17:29:27.955794 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-59e4556c5a0c3416c8b176170e67f918f7b3228ee1ad393381ed6e0c7046e4f4 WatchSource:0}: Error finding container 59e4556c5a0c3416c8b176170e67f918f7b3228ee1ad393381ed6e0c7046e4f4: Status 404 returned error can't find the container with id 59e4556c5a0c3416c8b176170e67f918f7b3228ee1ad393381ed6e0c7046e4f4 Dec 04 17:29:28 crc kubenswrapper[4948]: I1204 17:29:28.426605 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"59e4556c5a0c3416c8b176170e67f918f7b3228ee1ad393381ed6e0c7046e4f4"} Dec 04 17:29:28 crc kubenswrapper[4948]: I1204 17:29:28.918634 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:28 crc kubenswrapper[4948]: I1204 17:29:28.919861 4948 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:28 crc kubenswrapper[4948]: I1204 17:29:28.920348 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:28 crc kubenswrapper[4948]: I1204 17:29:28.920844 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:28 crc kubenswrapper[4948]: I1204 17:29:28.921258 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.437591 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.437687 4948 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd" exitCode=1 Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.437824 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd"} Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.438742 4948 scope.go:117] "RemoveContainer" containerID="599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd" Dec 04 
17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.438828 4948 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.440135 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.440763 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.441165 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.441653 4948 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: 
connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.442017 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.450114 4948 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="667c7ba35fed4a98c7e353b572c1316a69ca46cf03329c3516eb5f01ac445c75" exitCode=0 Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.450178 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"667c7ba35fed4a98c7e353b572c1316a69ca46cf03329c3516eb5f01ac445c75"} Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.450686 4948 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.450752 4948 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:29 crc kubenswrapper[4948]: E1204 17:29:29.451511 4948 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.451629 4948 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.453248 4948 status_manager.go:851] "Failed to get status for pod" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.453529 4948 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.453804 4948 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.454093 4948 status_manager.go:851] "Failed to get status for pod" podUID="767a0495-90ff-412b-87da-a788808cda0e" pod="openshift-marketplace/certified-operators-qw297" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-qw297\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:29 crc kubenswrapper[4948]: I1204 17:29:29.454361 4948 status_manager.go:851] "Failed to get status for pod" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" pod="openshift-marketplace/community-operators-m8dlm" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m8dlm\": dial tcp 38.102.83.212:6443: connect: connection refused" Dec 04 17:29:30 crc kubenswrapper[4948]: I1204 17:29:30.458009 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 17:29:30 crc kubenswrapper[4948]: I1204 17:29:30.458620 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c600aa0f03022aedbf65a6a8d24bd82b079d808e2cd087c1a594701244440166"} Dec 04 17:29:30 crc kubenswrapper[4948]: I1204 17:29:30.462182 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6bc551eb3a26532ac8478e2a7c6752401e0b4cdbd20a8982f33533e018936510"} Dec 04 17:29:30 crc kubenswrapper[4948]: I1204 17:29:30.462206 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"493142fbc2f0d65c206ab6f25c115d9c42f4a4c3c01259240a1e13487f3ffccf"} Dec 04 17:29:30 crc kubenswrapper[4948]: I1204 17:29:30.462215 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"69d96ef867336622424534cb26e5e800956cea8011b6d6308f6617b7fd248f19"} Dec 04 17:29:31 crc kubenswrapper[4948]: I1204 17:29:31.473323 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b1208b783c457b9a363b0ef0695683ca46b03781cb55ad9eeaca17a41c42248d"} Dec 04 17:29:31 crc kubenswrapper[4948]: I1204 17:29:31.473600 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:31 crc kubenswrapper[4948]: I1204 17:29:31.473611 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3d05a95d1137d66a34cc13dbf6012fe71fc0e83c0fb24008c6bb16d84a869748"} Dec 04 17:29:31 crc kubenswrapper[4948]: I1204 17:29:31.473568 4948 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:31 crc kubenswrapper[4948]: I1204 17:29:31.473629 4948 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:32 crc kubenswrapper[4948]: I1204 17:29:32.930580 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:32 crc kubenswrapper[4948]: I1204 17:29:32.930629 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:32 crc kubenswrapper[4948]: I1204 17:29:32.935942 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:33 crc kubenswrapper[4948]: I1204 17:29:33.283484 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:29:34 crc kubenswrapper[4948]: I1204 17:29:34.605115 4948 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" podUID="62821e25-9412-4650-a9e0-34f4fe49656b" containerName="oauth-openshift" containerID="cri-o://813548feb85ed86684be112b00d9e592abdc413274bf21d3e2532a759e46104b" gracePeriod=15 Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.495375 4948 generic.go:334] "Generic (PLEG): container finished" podID="62821e25-9412-4650-a9e0-34f4fe49656b" containerID="813548feb85ed86684be112b00d9e592abdc413274bf21d3e2532a759e46104b" exitCode=0 Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.495472 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" event={"ID":"62821e25-9412-4650-a9e0-34f4fe49656b","Type":"ContainerDied","Data":"813548feb85ed86684be112b00d9e592abdc413274bf21d3e2532a759e46104b"} Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.495892 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" event={"ID":"62821e25-9412-4650-a9e0-34f4fe49656b","Type":"ContainerDied","Data":"21c21358c2662a628a6e9b2dd29795e1ba9a49d2e7bac9db4ebc7c6215473b7f"} Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.495995 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21c21358c2662a628a6e9b2dd29795e1ba9a49d2e7bac9db4ebc7c6215473b7f" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.496744 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.650876 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-serving-cert\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.650937 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-trusted-ca-bundle\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.650957 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-service-ca\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.650982 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-router-certs\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651007 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-provider-selection\") pod 
\"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651037 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-session\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651077 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-cliconfig\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651095 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-error\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651127 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-ocp-branding-template\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651155 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62821e25-9412-4650-a9e0-34f4fe49656b-audit-dir\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc 
kubenswrapper[4948]: I1204 17:29:35.651200 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-idp-0-file-data\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651220 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-login\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651247 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5rwb\" (UniqueName: \"kubernetes.io/projected/62821e25-9412-4650-a9e0-34f4fe49656b-kube-api-access-b5rwb\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.651269 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-audit-policies\") pod \"62821e25-9412-4650-a9e0-34f4fe49656b\" (UID: \"62821e25-9412-4650-a9e0-34f4fe49656b\") " Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.652334 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.652323 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.652643 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.653895 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.653984 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/62821e25-9412-4650-a9e0-34f4fe49656b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.657287 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.660697 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62821e25-9412-4650-a9e0-34f4fe49656b-kube-api-access-b5rwb" (OuterVolumeSpecName: "kube-api-access-b5rwb") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "kube-api-access-b5rwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.660747 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.661231 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.661474 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.662135 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.668450 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.670313 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.671804 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "62821e25-9412-4650-a9e0-34f4fe49656b" (UID: "62821e25-9412-4650-a9e0-34f4fe49656b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753087 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753136 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753158 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753178 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753201 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753219 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753238 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753258 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753305 4948 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62821e25-9412-4650-a9e0-34f4fe49656b-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753325 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753345 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753364 4948 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-b5rwb\" (UniqueName: \"kubernetes.io/projected/62821e25-9412-4650-a9e0-34f4fe49656b-kube-api-access-b5rwb\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753382 4948 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62821e25-9412-4650-a9e0-34f4fe49656b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:35 crc kubenswrapper[4948]: I1204 17:29:35.753400 4948 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/62821e25-9412-4650-a9e0-34f4fe49656b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:29:36 crc kubenswrapper[4948]: I1204 17:29:36.484931 4948 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:36 crc kubenswrapper[4948]: I1204 17:29:36.503007 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2tl2h" Dec 04 17:29:36 crc kubenswrapper[4948]: I1204 17:29:36.504503 4948 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:36 crc kubenswrapper[4948]: I1204 17:29:36.504539 4948 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:36 crc kubenswrapper[4948]: I1204 17:29:36.509101 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 17:29:36 crc kubenswrapper[4948]: I1204 17:29:36.512668 4948 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="206c5fc7-62a1-4e39-abb0-4d3b7045cc8c" Dec 04 17:29:37 crc kubenswrapper[4948]: E1204 17:29:37.232988 4948 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 04 17:29:37 crc kubenswrapper[4948]: E1204 17:29:37.281726 4948 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Dec 04 17:29:37 crc kubenswrapper[4948]: I1204 17:29:37.507901 4948 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:37 crc kubenswrapper[4948]: I1204 17:29:37.508160 4948 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5df3161c-11e8-460d-9c77-68d23acc9609" Dec 04 17:29:38 crc kubenswrapper[4948]: I1204 17:29:38.936546 4948 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="206c5fc7-62a1-4e39-abb0-4d3b7045cc8c" Dec 04 17:29:39 crc kubenswrapper[4948]: I1204 17:29:39.177563 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:29:39 crc kubenswrapper[4948]: I1204 17:29:39.177871 4948 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 17:29:39 crc kubenswrapper[4948]: I1204 17:29:39.178079 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 17:29:40 crc kubenswrapper[4948]: I1204 17:29:40.626952 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:29:40 crc kubenswrapper[4948]: I1204 17:29:40.627414 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:29:45 crc kubenswrapper[4948]: I1204 
17:29:45.758903 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 17:29:46 crc kubenswrapper[4948]: I1204 17:29:46.353411 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 17:29:46 crc kubenswrapper[4948]: I1204 17:29:46.457752 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 17:29:46 crc kubenswrapper[4948]: I1204 17:29:46.910786 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 17:29:47 crc kubenswrapper[4948]: I1204 17:29:47.069071 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 17:29:47 crc kubenswrapper[4948]: I1204 17:29:47.134862 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 17:29:47 crc kubenswrapper[4948]: I1204 17:29:47.539481 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 17:29:47 crc kubenswrapper[4948]: I1204 17:29:47.671111 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 17:29:47 crc kubenswrapper[4948]: I1204 17:29:47.824309 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 17:29:47 crc kubenswrapper[4948]: I1204 17:29:47.897390 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.272545 4948 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt"
Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.329425 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.448764 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.450542 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.467539 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.577230 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 04 17:29:48 crc kubenswrapper[4948]: I1204 17:29:48.862813 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.006461 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.053693 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.111318 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.148283 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.181301 4948 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.181384 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.210663 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.243543 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.325097 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.441330 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.448916 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.527775 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.554425 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.580032 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.618976 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.646229 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.783943 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.816841 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.888745 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.940672 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.946832 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.958924 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.979119 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 04 17:29:49 crc kubenswrapper[4948]: I1204 17:29:49.999674 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.016938 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.053730 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.080123 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.212276 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.257382 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.335190 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.354741 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.368996 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.465917 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.601556 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.601681 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.613207 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.657283 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.691993 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.939481 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.953002 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 04 17:29:50 crc kubenswrapper[4948]: I1204 17:29:50.977867 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.051768 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.054346 4948 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.176999 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.194923 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.196192 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.211349 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.383822 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.627554 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.673328 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.749335 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.776548 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.847385 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.855825 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 04 17:29:51 crc kubenswrapper[4948]: I1204 17:29:51.931661 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.139149 4948 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.143003 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.142980754 podStartE2EDuration="40.142980754s" podCreationTimestamp="2025-12-04 17:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:29:36.112180292 +0000 UTC m=+187.473254714" watchObservedRunningTime="2025-12-04 17:29:52.142980754 +0000 UTC m=+203.504055166"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.144974 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/community-operators-m8dlm","openshift-marketplace/certified-operators-qw297","openshift-authentication/oauth-openshift-558db77b4-2tl2h"]
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.145254 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.149395 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.168026 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.168007402 podStartE2EDuration="16.168007402s" podCreationTimestamp="2025-12-04 17:29:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:29:52.165275218 +0000 UTC m=+203.526349650" watchObservedRunningTime="2025-12-04 17:29:52.168007402 +0000 UTC m=+203.529081814"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.191326 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.194418 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.219110 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.222020 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.226977 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.288750 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.345119 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.383107 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.468829 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.505941 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.633386 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.668858 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.687369 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.809308 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.852668 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.925655 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62821e25-9412-4650-a9e0-34f4fe49656b" path="/var/lib/kubelet/pods/62821e25-9412-4650-a9e0-34f4fe49656b/volumes"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.929738 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" path="/var/lib/kubelet/pods/74848112-8c60-4bcf-9f90-caee5c6e7f17/volumes"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.930837 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767a0495-90ff-412b-87da-a788808cda0e" path="/var/lib/kubelet/pods/767a0495-90ff-412b-87da-a788808cda0e/volumes"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.975160 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 04 17:29:52 crc kubenswrapper[4948]: I1204 17:29:52.977612 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.011877 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.019754 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.045586 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.047837 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.048162 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.093836 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.113562 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.131711 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.147892 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.382012 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.443288 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.527745 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.570797 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.581146 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.587652 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.600013 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.666266 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.764356 4948 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.796388 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.801438 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.839591 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.841390 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.913194 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 04 17:29:53 crc kubenswrapper[4948]: I1204 17:29:53.951148 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.072395 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.124790 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.212471 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.231455 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.233997 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.254454 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.255239 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.413824 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.468505 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.564637 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.584037 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.680552 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.686263 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.694848 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.817142 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.853296 4948 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 04 17:29:54 crc kubenswrapper[4948]: I1204 17:29:54.932366 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.033345 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.183128 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.278867 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.280684 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.321074 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.373937 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.514473 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.522902 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.543544 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.583734 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.652370 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.677732 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.727514 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.879343 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.937110 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 04 17:29:55 crc kubenswrapper[4948]: I1204 17:29:55.989357 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.143796 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.153734 4948 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.194586 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.217986 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.333323 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.337331 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.355785 4948 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.358639 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.361372 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.409803 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.547644 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.583006 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.660453 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.735987 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.870838 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.879384 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.906710 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 04 17:29:56 crc kubenswrapper[4948]: I1204 17:29:56.922136 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.180295 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.256660 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.309417 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.406557 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.443562 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.655893 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.664998 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.688805 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.782956 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.790766 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.831547 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.832536 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.840802 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.854361 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.925979 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.939767 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.963653 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 04 17:29:57 crc kubenswrapper[4948]: I1204 17:29:57.975578 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.017093 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.019780 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.323852 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.327926 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.403686 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.408775 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.468728 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.563148 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.581085 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.630926 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.786426 4948 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.786703 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de" gracePeriod=5
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.854096 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.874791
4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.918658 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.923869 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.924614 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.944623 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 17:29:58 crc kubenswrapper[4948]: I1204 17:29:58.946682 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.012687 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.123519 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.178781 4948 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.178863 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.178944 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.179612 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c600aa0f03022aedbf65a6a8d24bd82b079d808e2cd087c1a594701244440166"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.179785 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c600aa0f03022aedbf65a6a8d24bd82b079d808e2cd087c1a594701244440166" gracePeriod=30 Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.250707 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.306266 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.323501 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.440867 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 
17:29:59.515735 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.949911 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 17:29:59 crc kubenswrapper[4948]: I1204 17:29:59.972298 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.003838 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.063287 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069394 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57d8db7c67-6xnp9"] Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.069738 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="extract-utilities" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069778 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="extract-utilities" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.069805 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="extract-content" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069826 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="extract-content" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.069851 4948 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="registry-server" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069868 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="registry-server" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.069889 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="extract-content" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069905 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="extract-content" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.069935 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" containerName="installer" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069951 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" containerName="installer" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.069976 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="registry-server" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.069991 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="registry-server" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.070028 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="extract-utilities" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070080 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="extract-utilities" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.070099 4948 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="62821e25-9412-4650-a9e0-34f4fe49656b" containerName="oauth-openshift" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070115 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="62821e25-9412-4650-a9e0-34f4fe49656b" containerName="oauth-openshift" Dec 04 17:30:00 crc kubenswrapper[4948]: E1204 17:30:00.070136 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070152 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070319 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="74848112-8c60-4bcf-9f90-caee5c6e7f17" containerName="registry-server" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070344 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="62821e25-9412-4650-a9e0-34f4fe49656b" containerName="oauth-openshift" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070432 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ea971c-3d85-48c2-8eed-04157dfa2f78" containerName="installer" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070454 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="767a0495-90ff-412b-87da-a788808cda0e" containerName="registry-server" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.070479 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.071251 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.074226 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.074817 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075016 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075140 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075180 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075144 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075304 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075432 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.075919 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.076090 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 17:30:00 crc 
kubenswrapper[4948]: I1204 17:30:00.079458 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.080510 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.093175 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.094482 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.107428 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57d8db7c67-6xnp9"] Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.112610 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172642 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-error\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172685 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " 
pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172707 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172769 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-router-certs\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172792 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172824 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172844 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05567ffb-0d57-4bda-935b-0edb6d0b6c85-audit-dir\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172869 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172894 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-service-ca\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172912 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjq42\" (UniqueName: \"kubernetes.io/projected/05567ffb-0d57-4bda-935b-0edb6d0b6c85-kube-api-access-wjq42\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172945 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172967 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-login\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.172988 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-session\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.173057 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-audit-policies\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.211095 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274152 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-service-ca\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274212 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274238 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjq42\" (UniqueName: \"kubernetes.io/projected/05567ffb-0d57-4bda-935b-0edb6d0b6c85-kube-api-access-wjq42\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274273 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-login\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274300 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-session\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " 
pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274323 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-audit-policies\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274349 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-error\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274373 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274398 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274426 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-router-certs\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274451 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274485 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274514 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05567ffb-0d57-4bda-935b-0edb6d0b6c85-audit-dir\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.274541 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " 
pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.275346 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-service-ca\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.275803 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.276373 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/05567ffb-0d57-4bda-935b-0edb6d0b6c85-audit-dir\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.276552 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.276766 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/05567ffb-0d57-4bda-935b-0edb6d0b6c85-audit-policies\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.280498 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.280581 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-router-certs\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.280624 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.281111 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 
04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.281395 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-system-session\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.282091 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-error\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.290693 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-login\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.291009 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/05567ffb-0d57-4bda-935b-0edb6d0b6c85-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.297576 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjq42\" (UniqueName: 
\"kubernetes.io/projected/05567ffb-0d57-4bda-935b-0edb6d0b6c85-kube-api-access-wjq42\") pod \"oauth-openshift-57d8db7c67-6xnp9\" (UID: \"05567ffb-0d57-4bda-935b-0edb6d0b6c85\") " pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.378729 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.394628 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.397194 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.425218 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.619747 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57d8db7c67-6xnp9"] Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.632291 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.641082 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" event={"ID":"05567ffb-0d57-4bda-935b-0edb6d0b6c85","Type":"ContainerStarted","Data":"72571f3dc4fd0a45c507915d69e0aeacfdb694f373655721481a7593ca2c28a1"} Dec 04 17:30:00 crc kubenswrapper[4948]: I1204 17:30:00.866984 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.027947 
4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.114406 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.143014 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.167263 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.174242 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.341530 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.354809 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.383498 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.424659 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.437886 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.612083 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 
17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.647804 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" event={"ID":"05567ffb-0d57-4bda-935b-0edb6d0b6c85","Type":"ContainerStarted","Data":"012c84af5506b6c0ea867386055cfbc3391ea70736ba012985040f9a9229216a"} Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.648217 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.661701 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.667942 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57d8db7c67-6xnp9" podStartSLOduration=52.667922543 podStartE2EDuration="52.667922543s" podCreationTimestamp="2025-12-04 17:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:30:01.665916571 +0000 UTC m=+213.026990993" watchObservedRunningTime="2025-12-04 17:30:01.667922543 +0000 UTC m=+213.028996945" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.718345 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 17:30:01 crc kubenswrapper[4948]: I1204 17:30:01.926082 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 17:30:02 crc kubenswrapper[4948]: I1204 17:30:02.045814 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 17:30:02 crc kubenswrapper[4948]: I1204 17:30:02.340910 4948 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 17:30:02 crc kubenswrapper[4948]: I1204 17:30:02.501067 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 17:30:02 crc kubenswrapper[4948]: I1204 17:30:02.567524 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.434947 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.435031 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632411 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632606 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632662 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632699 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632737 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632749 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632780 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632823 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.632826 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.633279 4948 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.633302 4948 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.633315 4948 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.633326 4948 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.641370 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.667782 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.667869 4948 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de" exitCode=137 Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.667909 4948 scope.go:117] "RemoveContainer" containerID="ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.668238 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.687088 4948 scope.go:117] "RemoveContainer" containerID="ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de" Dec 04 17:30:04 crc kubenswrapper[4948]: E1204 17:30:04.687543 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de\": container with ID starting with ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de not found: ID does not exist" containerID="ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.687585 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de"} err="failed to get container status \"ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de\": rpc error: code = NotFound desc = could not find container 
\"ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de\": container with ID starting with ca4675b5fb81fcc06d8de9d2fa8ecc6d45c31f2fddc651c7f5feacca7307c6de not found: ID does not exist" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.734204 4948 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.924753 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.925351 4948 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.942866 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.942937 4948 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="7a5e9a4b-d9b2-4664-9a98-75d69f3727b3" Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.949738 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 17:30:04 crc kubenswrapper[4948]: I1204 17:30:04.949778 4948 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="7a5e9a4b-d9b2-4664-9a98-75d69f3727b3" Dec 04 17:30:10 crc kubenswrapper[4948]: I1204 17:30:10.625260 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:30:10 crc kubenswrapper[4948]: I1204 17:30:10.625351 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:30:10 crc kubenswrapper[4948]: I1204 17:30:10.625417 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:30:10 crc kubenswrapper[4948]: I1204 17:30:10.626218 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e811b8000b0a1451742559953ae4b8ceaef08af55bb4663a9967a43362e5d3b"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:30:10 crc kubenswrapper[4948]: I1204 17:30:10.626321 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://8e811b8000b0a1451742559953ae4b8ceaef08af55bb4663a9967a43362e5d3b" gracePeriod=600 Dec 04 17:30:12 crc kubenswrapper[4948]: I1204 17:30:12.709412 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="8e811b8000b0a1451742559953ae4b8ceaef08af55bb4663a9967a43362e5d3b" exitCode=0 Dec 04 17:30:12 crc kubenswrapper[4948]: I1204 17:30:12.709503 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"8e811b8000b0a1451742559953ae4b8ceaef08af55bb4663a9967a43362e5d3b"} Dec 04 17:30:13 crc kubenswrapper[4948]: I1204 17:30:13.719234 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"d1115719e47aa6bf1c1453a2c9bdd06db75016c207034cc5d723bbc4c3177a31"} Dec 04 17:30:21 crc kubenswrapper[4948]: I1204 17:30:21.514408 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.720578 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.751245 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.830821 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.832382 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.832411 4948 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c600aa0f03022aedbf65a6a8d24bd82b079d808e2cd087c1a594701244440166" exitCode=137 Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.832436 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c600aa0f03022aedbf65a6a8d24bd82b079d808e2cd087c1a594701244440166"} Dec 04 17:30:29 crc kubenswrapper[4948]: I1204 17:30:29.832465 4948 scope.go:117] "RemoveContainer" containerID="599ade09339a6ac0e1f3204fe402337b5e194af68440b50f959904faa2ca6fcd" Dec 04 17:30:30 crc kubenswrapper[4948]: I1204 17:30:30.850156 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 04 17:30:30 crc kubenswrapper[4948]: I1204 17:30:30.851407 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7f9779beef953309c4be95f03d0845cda2641708b90e06109a7f5c84be363af0"} Dec 04 17:30:32 crc kubenswrapper[4948]: I1204 17:30:32.865157 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 17:30:33 crc kubenswrapper[4948]: I1204 17:30:33.284087 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:30:36 crc kubenswrapper[4948]: I1204 17:30:36.439701 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 17:30:39 crc kubenswrapper[4948]: I1204 17:30:39.177829 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:30:39 crc kubenswrapper[4948]: I1204 17:30:39.182525 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:30:43 crc kubenswrapper[4948]: I1204 17:30:43.289077 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.357520 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rz5j4"] Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.358174 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" podUID="cad76813-b0e7-4c9c-86e9-44d797f5dbb9" containerName="controller-manager" containerID="cri-o://8fca269ec606ff95a504d12038c04488edeaf640be2bd7c43d146294d142ffea" gracePeriod=30 Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.364020 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"] Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.364275 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" podUID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" containerName="route-controller-manager" containerID="cri-o://92fba8e6e64c770e364960949f60a9af9e32f6354376c16ae71909d7f0aa34f3" gracePeriod=30 Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.498461 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk"] Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.499295 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.504192 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.504767 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.518087 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk"] Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.591668 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8lr\" (UniqueName: \"kubernetes.io/projected/3e652828-b844-4fa8-8e4d-54726614f646-kube-api-access-8h8lr\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.591726 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e652828-b844-4fa8-8e4d-54726614f646-secret-volume\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.591874 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e652828-b844-4fa8-8e4d-54726614f646-config-volume\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.692931 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8lr\" (UniqueName: \"kubernetes.io/projected/3e652828-b844-4fa8-8e4d-54726614f646-kube-api-access-8h8lr\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.693002 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e652828-b844-4fa8-8e4d-54726614f646-secret-volume\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.693078 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e652828-b844-4fa8-8e4d-54726614f646-config-volume\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.694089 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e652828-b844-4fa8-8e4d-54726614f646-config-volume\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.698667 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3e652828-b844-4fa8-8e4d-54726614f646-secret-volume\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.709073 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8lr\" (UniqueName: \"kubernetes.io/projected/3e652828-b844-4fa8-8e4d-54726614f646-kube-api-access-8h8lr\") pod \"collect-profiles-29414490-fzmqk\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.813132 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.939113 4948 generic.go:334] "Generic (PLEG): container finished" podID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" containerID="92fba8e6e64c770e364960949f60a9af9e32f6354376c16ae71909d7f0aa34f3" exitCode=0 Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.939388 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" event={"ID":"771c1e0f-69a0-4bf2-8345-37ed755de8ff","Type":"ContainerDied","Data":"92fba8e6e64c770e364960949f60a9af9e32f6354376c16ae71909d7f0aa34f3"} Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.941146 4948 generic.go:334] "Generic (PLEG): container finished" podID="cad76813-b0e7-4c9c-86e9-44d797f5dbb9" containerID="8fca269ec606ff95a504d12038c04488edeaf640be2bd7c43d146294d142ffea" exitCode=0 Dec 04 17:30:46 crc kubenswrapper[4948]: I1204 17:30:46.941200 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" 
event={"ID":"cad76813-b0e7-4c9c-86e9-44d797f5dbb9","Type":"ContainerDied","Data":"8fca269ec606ff95a504d12038c04488edeaf640be2bd7c43d146294d142ffea"} Dec 04 17:30:47 crc kubenswrapper[4948]: I1204 17:30:47.180910 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk"] Dec 04 17:30:47 crc kubenswrapper[4948]: I1204 17:30:47.349939 4948 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-m2wdp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 04 17:30:47 crc kubenswrapper[4948]: I1204 17:30:47.350003 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" podUID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 04 17:30:47 crc kubenswrapper[4948]: I1204 17:30:47.947913 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" event={"ID":"3e652828-b844-4fa8-8e4d-54726614f646","Type":"ContainerStarted","Data":"9f5a8b36a2fb3a1530c4a5164a80ac28d59af38f94322329652d3ca502b1ae76"} Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.585264 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.606774 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj"] Dec 04 17:30:48 crc kubenswrapper[4948]: E1204 17:30:48.607022 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cad76813-b0e7-4c9c-86e9-44d797f5dbb9" containerName="controller-manager" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.607060 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad76813-b0e7-4c9c-86e9-44d797f5dbb9" containerName="controller-manager" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.607186 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cad76813-b0e7-4c9c-86e9-44d797f5dbb9" containerName="controller-manager" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.607701 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.619589 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj"] Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.716890 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-client-ca\") pod \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.716948 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-serving-cert\") pod \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.716972 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-config\") pod \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.716989 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-proxy-ca-bundles\") pod \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\" (UID: \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.717031 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dwn4\" (UniqueName: \"kubernetes.io/projected/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-kube-api-access-4dwn4\") pod \"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\" (UID: 
\"cad76813-b0e7-4c9c-86e9-44d797f5dbb9\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.717195 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92f2n\" (UniqueName: \"kubernetes.io/projected/f4f479a3-49bb-4ea2-a784-47f537a251d6-kube-api-access-92f2n\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.717230 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-config\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.717281 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-client-ca\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.717303 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f479a3-49bb-4ea2-a784-47f537a251d6-serving-cert\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.717325 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-proxy-ca-bundles\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.718172 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "cad76813-b0e7-4c9c-86e9-44d797f5dbb9" (UID: "cad76813-b0e7-4c9c-86e9-44d797f5dbb9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.719056 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cad76813-b0e7-4c9c-86e9-44d797f5dbb9" (UID: "cad76813-b0e7-4c9c-86e9-44d797f5dbb9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.719094 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-config" (OuterVolumeSpecName: "config") pod "cad76813-b0e7-4c9c-86e9-44d797f5dbb9" (UID: "cad76813-b0e7-4c9c-86e9-44d797f5dbb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.722964 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-kube-api-access-4dwn4" (OuterVolumeSpecName: "kube-api-access-4dwn4") pod "cad76813-b0e7-4c9c-86e9-44d797f5dbb9" (UID: "cad76813-b0e7-4c9c-86e9-44d797f5dbb9"). InnerVolumeSpecName "kube-api-access-4dwn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.722966 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cad76813-b0e7-4c9c-86e9-44d797f5dbb9" (UID: "cad76813-b0e7-4c9c-86e9-44d797f5dbb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.754953 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818678 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-proxy-ca-bundles\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818750 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92f2n\" (UniqueName: \"kubernetes.io/projected/f4f479a3-49bb-4ea2-a784-47f537a251d6-kube-api-access-92f2n\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818797 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-config\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc 
kubenswrapper[4948]: I1204 17:30:48.818894 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-client-ca\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818919 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f479a3-49bb-4ea2-a784-47f537a251d6-serving-cert\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818964 4948 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818976 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.818987 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.819001 4948 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.819013 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dwn4\" 
(UniqueName: \"kubernetes.io/projected/cad76813-b0e7-4c9c-86e9-44d797f5dbb9-kube-api-access-4dwn4\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.820360 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-client-ca\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.820657 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-proxy-ca-bundles\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.820678 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-config\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.823101 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f479a3-49bb-4ea2-a784-47f537a251d6-serving-cert\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.836729 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92f2n\" (UniqueName: 
\"kubernetes.io/projected/f4f479a3-49bb-4ea2-a784-47f537a251d6-kube-api-access-92f2n\") pod \"controller-manager-cdbdfd9bd-7fvbj\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.919567 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-config\") pod \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.919637 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnrt6\" (UniqueName: \"kubernetes.io/projected/771c1e0f-69a0-4bf2-8345-37ed755de8ff-kube-api-access-lnrt6\") pod \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.919667 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-client-ca\") pod \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.919696 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771c1e0f-69a0-4bf2-8345-37ed755de8ff-serving-cert\") pod \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\" (UID: \"771c1e0f-69a0-4bf2-8345-37ed755de8ff\") " Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.920538 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "771c1e0f-69a0-4bf2-8345-37ed755de8ff" (UID: "771c1e0f-69a0-4bf2-8345-37ed755de8ff"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.920553 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-config" (OuterVolumeSpecName: "config") pod "771c1e0f-69a0-4bf2-8345-37ed755de8ff" (UID: "771c1e0f-69a0-4bf2-8345-37ed755de8ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.923658 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/771c1e0f-69a0-4bf2-8345-37ed755de8ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "771c1e0f-69a0-4bf2-8345-37ed755de8ff" (UID: "771c1e0f-69a0-4bf2-8345-37ed755de8ff"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.923756 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/771c1e0f-69a0-4bf2-8345-37ed755de8ff-kube-api-access-lnrt6" (OuterVolumeSpecName: "kube-api-access-lnrt6") pod "771c1e0f-69a0-4bf2-8345-37ed755de8ff" (UID: "771c1e0f-69a0-4bf2-8345-37ed755de8ff"). InnerVolumeSpecName "kube-api-access-lnrt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.927381 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.953481 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" event={"ID":"771c1e0f-69a0-4bf2-8345-37ed755de8ff","Type":"ContainerDied","Data":"6d56fc569bdb59ff329acbc4acb74a0ed2382cb76d25007d5e021e6916dc308f"} Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.954034 4948 scope.go:117] "RemoveContainer" containerID="92fba8e6e64c770e364960949f60a9af9e32f6354376c16ae71909d7f0aa34f3" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.953747 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.955403 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" event={"ID":"3e652828-b844-4fa8-8e4d-54726614f646","Type":"ContainerStarted","Data":"71402d013c6c0349294925fbd5f06e2b6982a5197a16e9fa95fe6c800ba75efa"} Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.958732 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" event={"ID":"cad76813-b0e7-4c9c-86e9-44d797f5dbb9","Type":"ContainerDied","Data":"4f13afb2157b129fcb89d1d5bd75562a15dab8d4e97b76ccdf436ea868b3c675"} Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.958803 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-rz5j4" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.969719 4948 scope.go:117] "RemoveContainer" containerID="8fca269ec606ff95a504d12038c04488edeaf640be2bd7c43d146294d142ffea" Dec 04 17:30:48 crc kubenswrapper[4948]: I1204 17:30:48.997682 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" podStartSLOduration=2.997616829 podStartE2EDuration="2.997616829s" podCreationTimestamp="2025-12-04 17:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:30:48.97874077 +0000 UTC m=+260.339815182" watchObservedRunningTime="2025-12-04 17:30:48.997616829 +0000 UTC m=+260.358691261" Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.007222 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rz5j4"] Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.013576 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-rz5j4"] Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.024075 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"] Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.025607 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.025653 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnrt6\" (UniqueName: \"kubernetes.io/projected/771c1e0f-69a0-4bf2-8345-37ed755de8ff-kube-api-access-lnrt6\") on node \"crc\" DevicePath \"\"" Dec 04 
17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.025679 4948 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/771c1e0f-69a0-4bf2-8345-37ed755de8ff-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.025705 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771c1e0f-69a0-4bf2-8345-37ed755de8ff-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.029461 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-m2wdp"] Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.114615 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj"] Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.965865 4948 generic.go:334] "Generic (PLEG): container finished" podID="3e652828-b844-4fa8-8e4d-54726614f646" containerID="71402d013c6c0349294925fbd5f06e2b6982a5197a16e9fa95fe6c800ba75efa" exitCode=0 Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.965965 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" event={"ID":"3e652828-b844-4fa8-8e4d-54726614f646","Type":"ContainerDied","Data":"71402d013c6c0349294925fbd5f06e2b6982a5197a16e9fa95fe6c800ba75efa"} Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.968811 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" event={"ID":"f4f479a3-49bb-4ea2-a784-47f537a251d6","Type":"ContainerStarted","Data":"2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee"} Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.968849 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" event={"ID":"f4f479a3-49bb-4ea2-a784-47f537a251d6","Type":"ContainerStarted","Data":"e35f7475de1179317510ace39597a896b159871f2bfd415e16733cb9c37a9a52"} Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.969388 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:49 crc kubenswrapper[4948]: I1204 17:30:49.974265 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:30:50 crc kubenswrapper[4948]: I1204 17:30:50.007419 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" podStartSLOduration=4.007401996 podStartE2EDuration="4.007401996s" podCreationTimestamp="2025-12-04 17:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:30:50.004373813 +0000 UTC m=+261.365448225" watchObservedRunningTime="2025-12-04 17:30:50.007401996 +0000 UTC m=+261.368476408" Dec 04 17:30:50 crc kubenswrapper[4948]: I1204 17:30:50.927664 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" path="/var/lib/kubelet/pods/771c1e0f-69a0-4bf2-8345-37ed755de8ff/volumes" Dec 04 17:30:50 crc kubenswrapper[4948]: I1204 17:30:50.930294 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad76813-b0e7-4c9c-86e9-44d797f5dbb9" path="/var/lib/kubelet/pods/cad76813-b0e7-4c9c-86e9-44d797f5dbb9/volumes" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.116193 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb"] Dec 04 17:30:51 crc kubenswrapper[4948]: E1204 17:30:51.116548 4948 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" containerName="route-controller-manager" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.116564 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" containerName="route-controller-manager" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.116815 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="771c1e0f-69a0-4bf2-8345-37ed755de8ff" containerName="route-controller-manager" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.117624 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.121877 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.121904 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb"] Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.122090 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.122157 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.122249 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.122407 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 
17:30:51.122445 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.205703 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.252617 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf79359-9ae9-422d-9b67-8dc7bba891f1-serving-cert\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.252710 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4475\" (UniqueName: \"kubernetes.io/projected/bcf79359-9ae9-422d-9b67-8dc7bba891f1-kube-api-access-m4475\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.252748 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-client-ca\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.252767 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-config\") pod 
\"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.354734 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e652828-b844-4fa8-8e4d-54726614f646-secret-volume\") pod \"3e652828-b844-4fa8-8e4d-54726614f646\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.355251 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e652828-b844-4fa8-8e4d-54726614f646-config-volume\") pod \"3e652828-b844-4fa8-8e4d-54726614f646\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.355537 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h8lr\" (UniqueName: \"kubernetes.io/projected/3e652828-b844-4fa8-8e4d-54726614f646-kube-api-access-8h8lr\") pod \"3e652828-b844-4fa8-8e4d-54726614f646\" (UID: \"3e652828-b844-4fa8-8e4d-54726614f646\") " Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.356108 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e652828-b844-4fa8-8e4d-54726614f646-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e652828-b844-4fa8-8e4d-54726614f646" (UID: "3e652828-b844-4fa8-8e4d-54726614f646"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.356370 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4475\" (UniqueName: \"kubernetes.io/projected/bcf79359-9ae9-422d-9b67-8dc7bba891f1-kube-api-access-m4475\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.356466 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-client-ca\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.356512 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-config\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.356573 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf79359-9ae9-422d-9b67-8dc7bba891f1-serving-cert\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.356660 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/3e652828-b844-4fa8-8e4d-54726614f646-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.357898 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-client-ca\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.358148 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-config\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.365404 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e652828-b844-4fa8-8e4d-54726614f646-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e652828-b844-4fa8-8e4d-54726614f646" (UID: "3e652828-b844-4fa8-8e4d-54726614f646"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.365433 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e652828-b844-4fa8-8e4d-54726614f646-kube-api-access-8h8lr" (OuterVolumeSpecName: "kube-api-access-8h8lr") pod "3e652828-b844-4fa8-8e4d-54726614f646" (UID: "3e652828-b844-4fa8-8e4d-54726614f646"). InnerVolumeSpecName "kube-api-access-8h8lr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.368135 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf79359-9ae9-422d-9b67-8dc7bba891f1-serving-cert\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.375542 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4475\" (UniqueName: \"kubernetes.io/projected/bcf79359-9ae9-422d-9b67-8dc7bba891f1-kube-api-access-m4475\") pod \"route-controller-manager-78cb5fd749-8fhwb\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.438246 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.457918 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h8lr\" (UniqueName: \"kubernetes.io/projected/3e652828-b844-4fa8-8e4d-54726614f646-kube-api-access-8h8lr\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.457964 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e652828-b844-4fa8-8e4d-54726614f646-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.630364 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb"] Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.991896 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" event={"ID":"3e652828-b844-4fa8-8e4d-54726614f646","Type":"ContainerDied","Data":"9f5a8b36a2fb3a1530c4a5164a80ac28d59af38f94322329652d3ca502b1ae76"} Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.991932 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5a8b36a2fb3a1530c4a5164a80ac28d59af38f94322329652d3ca502b1ae76" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.991942 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk" Dec 04 17:30:51 crc kubenswrapper[4948]: I1204 17:30:51.993985 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" event={"ID":"bcf79359-9ae9-422d-9b67-8dc7bba891f1","Type":"ContainerStarted","Data":"9198d6730780a50cc5b441ae9ea6c56852693b60c351f421175b39c7c0fd6017"} Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.000068 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" event={"ID":"bcf79359-9ae9-422d-9b67-8dc7bba891f1","Type":"ContainerStarted","Data":"bb8aec7d9929b01ab7d19192ce372f31f424d8ec3e2546f4c64596094fbf66e4"} Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.341831 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77jch"] Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.342142 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77jch" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="registry-server" containerID="cri-o://77d8a8f65056279c810b1a6c249fc35052efd8d27e3c16327f9fc6e70ecab2f6" gracePeriod=30 Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.365268 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22gwb"] Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.365526 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-22gwb" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="registry-server" containerID="cri-o://c3eaa4af95baccf8f31eb1654650c547d663fbdc728faf99613a4463715487d7" gracePeriod=30 Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.377230 4948 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hbqk5"] Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.377507 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerName="marketplace-operator" containerID="cri-o://9a3935c90d9b28eae221025b6929ffc89ee5b1e51b6801c8da2ecd6cf10b3db1" gracePeriod=30 Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.389990 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jw8ps"] Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.390234 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jw8ps" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="registry-server" containerID="cri-o://a898f9fce9fab0e97ceda44d90f84a5d3d5abc7ea3d7144b9d59aabfa934b25d" gracePeriod=30 Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.395350 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l48pp"] Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.395790 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l48pp" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="registry-server" containerID="cri-o://ae16fa2c605d20dd73cc12faf0c919c48f07d7430c67acb9feb5c7674881d65a" gracePeriod=30 Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.408478 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-526nd"] Dec 04 17:30:53 crc kubenswrapper[4948]: E1204 17:30:53.408725 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e652828-b844-4fa8-8e4d-54726614f646" containerName="collect-profiles" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 
17:30:53.408741 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e652828-b844-4fa8-8e4d-54726614f646" containerName="collect-profiles" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.408832 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e652828-b844-4fa8-8e4d-54726614f646" containerName="collect-profiles" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.409213 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.419782 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-526nd"] Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.496324 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q9hb\" (UniqueName: \"kubernetes.io/projected/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-kube-api-access-2q9hb\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.496676 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.496717 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-526nd\" (UID: 
\"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.598027 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q9hb\" (UniqueName: \"kubernetes.io/projected/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-kube-api-access-2q9hb\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.598096 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.598119 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.599399 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.603925 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.613886 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q9hb\" (UniqueName: \"kubernetes.io/projected/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-kube-api-access-2q9hb\") pod \"marketplace-operator-79b997595-526nd\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.734448 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:53 crc kubenswrapper[4948]: I1204 17:30:53.929064 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-526nd"] Dec 04 17:30:53 crc kubenswrapper[4948]: W1204 17:30:53.934774 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93aadeae_ed9a_4dc5_8151_1d3b5fa9d691.slice/crio-1b803e9cd11212a2b3de968464f151d186a9a9b3853911b71086ad1111101806 WatchSource:0}: Error finding container 1b803e9cd11212a2b3de968464f151d186a9a9b3853911b71086ad1111101806: Status 404 returned error can't find the container with id 1b803e9cd11212a2b3de968464f151d186a9a9b3853911b71086ad1111101806 Dec 04 17:30:54 crc kubenswrapper[4948]: I1204 17:30:54.005707 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" event={"ID":"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691","Type":"ContainerStarted","Data":"1b803e9cd11212a2b3de968464f151d186a9a9b3853911b71086ad1111101806"} Dec 04 17:30:54 
crc kubenswrapper[4948]: I1204 17:30:54.006121 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:54 crc kubenswrapper[4948]: I1204 17:30:54.010892 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:30:54 crc kubenswrapper[4948]: I1204 17:30:54.021120 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" podStartSLOduration=8.021101078 podStartE2EDuration="8.021101078s" podCreationTimestamp="2025-12-04 17:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:30:54.020731758 +0000 UTC m=+265.381806180" watchObservedRunningTime="2025-12-04 17:30:54.021101078 +0000 UTC m=+265.382175480" Dec 04 17:30:55 crc kubenswrapper[4948]: I1204 17:30:55.018200 4948 generic.go:334] "Generic (PLEG): container finished" podID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerID="c3eaa4af95baccf8f31eb1654650c547d663fbdc728faf99613a4463715487d7" exitCode=0 Dec 04 17:30:55 crc kubenswrapper[4948]: I1204 17:30:55.018311 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerDied","Data":"c3eaa4af95baccf8f31eb1654650c547d663fbdc728faf99613a4463715487d7"} Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.026090 4948 generic.go:334] "Generic (PLEG): container finished" podID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerID="77d8a8f65056279c810b1a6c249fc35052efd8d27e3c16327f9fc6e70ecab2f6" exitCode=0 Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.026331 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-77jch" event={"ID":"18aaaacf-fb8c-4ba8-ab03-b89ec705114b","Type":"ContainerDied","Data":"77d8a8f65056279c810b1a6c249fc35052efd8d27e3c16327f9fc6e70ecab2f6"} Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.030945 4948 generic.go:334] "Generic (PLEG): container finished" podID="cc8a9450-7e86-4194-962d-566fee4563df" containerID="ae16fa2c605d20dd73cc12faf0c919c48f07d7430c67acb9feb5c7674881d65a" exitCode=0 Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.031018 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerDied","Data":"ae16fa2c605d20dd73cc12faf0c919c48f07d7430c67acb9feb5c7674881d65a"} Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.032930 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" event={"ID":"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691","Type":"ContainerStarted","Data":"479097f7d565e8284acd2ac5d3af8e34377b8d4d864ce9f481ddb1743bba1e75"} Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.034728 4948 generic.go:334] "Generic (PLEG): container finished" podID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerID="9a3935c90d9b28eae221025b6929ffc89ee5b1e51b6801c8da2ecd6cf10b3db1" exitCode=0 Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.034777 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" event={"ID":"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf","Type":"ContainerDied","Data":"9a3935c90d9b28eae221025b6929ffc89ee5b1e51b6801c8da2ecd6cf10b3db1"} Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.036687 4948 generic.go:334] "Generic (PLEG): container finished" podID="fc2914f1-50b7-4a3a-902e-000091874005" containerID="a898f9fce9fab0e97ceda44d90f84a5d3d5abc7ea3d7144b9d59aabfa934b25d" exitCode=0 Dec 04 17:30:56 crc kubenswrapper[4948]: 
I1204 17:30:56.036717 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jw8ps" event={"ID":"fc2914f1-50b7-4a3a-902e-000091874005","Type":"ContainerDied","Data":"a898f9fce9fab0e97ceda44d90f84a5d3d5abc7ea3d7144b9d59aabfa934b25d"} Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.124698 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.231548 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-catalog-content\") pod \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.231586 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9kq9\" (UniqueName: \"kubernetes.io/projected/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-kube-api-access-x9kq9\") pod \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.231686 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-utilities\") pod \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\" (UID: \"09f28c0e-7133-4236-9614-fe2fe6b5e2e2\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.233467 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.235439 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-utilities" (OuterVolumeSpecName: "utilities") pod "09f28c0e-7133-4236-9614-fe2fe6b5e2e2" (UID: "09f28c0e-7133-4236-9614-fe2fe6b5e2e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.243354 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-kube-api-access-x9kq9" (OuterVolumeSpecName: "kube-api-access-x9kq9") pod "09f28c0e-7133-4236-9614-fe2fe6b5e2e2" (UID: "09f28c0e-7133-4236-9614-fe2fe6b5e2e2"). InnerVolumeSpecName "kube-api-access-x9kq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.300357 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09f28c0e-7133-4236-9614-fe2fe6b5e2e2" (UID: "09f28c0e-7133-4236-9614-fe2fe6b5e2e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.333114 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt6gp\" (UniqueName: \"kubernetes.io/projected/fc2914f1-50b7-4a3a-902e-000091874005-kube-api-access-mt6gp\") pod \"fc2914f1-50b7-4a3a-902e-000091874005\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.333224 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-utilities\") pod \"fc2914f1-50b7-4a3a-902e-000091874005\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.333276 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-catalog-content\") pod \"fc2914f1-50b7-4a3a-902e-000091874005\" (UID: \"fc2914f1-50b7-4a3a-902e-000091874005\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.333573 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.333596 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9kq9\" (UniqueName: \"kubernetes.io/projected/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-kube-api-access-x9kq9\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.333611 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09f28c0e-7133-4236-9614-fe2fe6b5e2e2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.334163 
4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-utilities" (OuterVolumeSpecName: "utilities") pod "fc2914f1-50b7-4a3a-902e-000091874005" (UID: "fc2914f1-50b7-4a3a-902e-000091874005"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.336592 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2914f1-50b7-4a3a-902e-000091874005-kube-api-access-mt6gp" (OuterVolumeSpecName: "kube-api-access-mt6gp") pod "fc2914f1-50b7-4a3a-902e-000091874005" (UID: "fc2914f1-50b7-4a3a-902e-000091874005"). InnerVolumeSpecName "kube-api-access-mt6gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.355430 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc2914f1-50b7-4a3a-902e-000091874005" (UID: "fc2914f1-50b7-4a3a-902e-000091874005"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.434474 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt6gp\" (UniqueName: \"kubernetes.io/projected/fc2914f1-50b7-4a3a-902e-000091874005-kube-api-access-mt6gp\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.434509 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.434519 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2914f1-50b7-4a3a-902e-000091874005-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.573292 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.626508 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.652874 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.737878 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b529v\" (UniqueName: \"kubernetes.io/projected/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-kube-api-access-b529v\") pod \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.737918 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-operator-metrics\") pod \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.737952 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-catalog-content\") pod \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738007 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-utilities\") pod \"cc8a9450-7e86-4194-962d-566fee4563df\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738026 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt2qp\" (UniqueName: \"kubernetes.io/projected/cc8a9450-7e86-4194-962d-566fee4563df-kube-api-access-bt2qp\") pod \"cc8a9450-7e86-4194-962d-566fee4563df\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738059 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-catalog-content\") pod \"cc8a9450-7e86-4194-962d-566fee4563df\" (UID: \"cc8a9450-7e86-4194-962d-566fee4563df\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738110 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-utilities\") pod \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738143 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-trusted-ca\") pod \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\" (UID: \"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738182 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksx8q\" (UniqueName: \"kubernetes.io/projected/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-kube-api-access-ksx8q\") pod \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\" (UID: \"18aaaacf-fb8c-4ba8-ab03-b89ec705114b\") " Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738870 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-utilities" (OuterVolumeSpecName: "utilities") pod "18aaaacf-fb8c-4ba8-ab03-b89ec705114b" (UID: "18aaaacf-fb8c-4ba8-ab03-b89ec705114b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.738916 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-utilities" (OuterVolumeSpecName: "utilities") pod "cc8a9450-7e86-4194-962d-566fee4563df" (UID: "cc8a9450-7e86-4194-962d-566fee4563df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.741576 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8a9450-7e86-4194-962d-566fee4563df-kube-api-access-bt2qp" (OuterVolumeSpecName: "kube-api-access-bt2qp") pod "cc8a9450-7e86-4194-962d-566fee4563df" (UID: "cc8a9450-7e86-4194-962d-566fee4563df"). InnerVolumeSpecName "kube-api-access-bt2qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.741927 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-kube-api-access-ksx8q" (OuterVolumeSpecName: "kube-api-access-ksx8q") pod "18aaaacf-fb8c-4ba8-ab03-b89ec705114b" (UID: "18aaaacf-fb8c-4ba8-ab03-b89ec705114b"). InnerVolumeSpecName "kube-api-access-ksx8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.744171 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" (UID: "5e200fe3-fcc4-4b69-9937-6a5ea6233cdf"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.746162 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" (UID: "5e200fe3-fcc4-4b69-9937-6a5ea6233cdf"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.746647 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-kube-api-access-b529v" (OuterVolumeSpecName: "kube-api-access-b529v") pod "5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" (UID: "5e200fe3-fcc4-4b69-9937-6a5ea6233cdf"). InnerVolumeSpecName "kube-api-access-b529v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.796837 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18aaaacf-fb8c-4ba8-ab03-b89ec705114b" (UID: "18aaaacf-fb8c-4ba8-ab03-b89ec705114b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839787 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839821 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839835 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt2qp\" (UniqueName: \"kubernetes.io/projected/cc8a9450-7e86-4194-962d-566fee4563df-kube-api-access-bt2qp\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839847 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839860 4948 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839871 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksx8q\" (UniqueName: \"kubernetes.io/projected/18aaaacf-fb8c-4ba8-ab03-b89ec705114b-kube-api-access-ksx8q\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.839882 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b529v\" (UniqueName: \"kubernetes.io/projected/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-kube-api-access-b529v\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc 
kubenswrapper[4948]: I1204 17:30:56.839897 4948 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.852566 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc8a9450-7e86-4194-962d-566fee4563df" (UID: "cc8a9450-7e86-4194-962d-566fee4563df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:30:56 crc kubenswrapper[4948]: I1204 17:30:56.941378 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc8a9450-7e86-4194-962d-566fee4563df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.046267 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22gwb" event={"ID":"09f28c0e-7133-4236-9614-fe2fe6b5e2e2","Type":"ContainerDied","Data":"cc789c44efb9db216713a106b4a06dfb67783d3675ef658c83de7436f2ced254"} Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.046333 4948 scope.go:117] "RemoveContainer" containerID="c3eaa4af95baccf8f31eb1654650c547d663fbdc728faf99613a4463715487d7" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.046385 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-22gwb" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.049202 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jw8ps" event={"ID":"fc2914f1-50b7-4a3a-902e-000091874005","Type":"ContainerDied","Data":"f52734e7864f36322f5a2d8d2e4a1fafcbbd32542e1566cee03af066a0cae5f7"} Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.049357 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jw8ps" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.052932 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" event={"ID":"5e200fe3-fcc4-4b69-9937-6a5ea6233cdf","Type":"ContainerDied","Data":"3328c8b8d41fdb661d5f2ad81e8372fe5908587cbbbb1709ca2e625ec692483d"} Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.052942 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hbqk5" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.057892 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77jch" event={"ID":"18aaaacf-fb8c-4ba8-ab03-b89ec705114b","Type":"ContainerDied","Data":"7679ee7c7516bd4f7e1ecea1a422715d324e4dcc0ed6a7744a8b07374902b3a2"} Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.058142 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77jch" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.063720 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l48pp" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.064214 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l48pp" event={"ID":"cc8a9450-7e86-4194-962d-566fee4563df","Type":"ContainerDied","Data":"9acbcc6ea022375a18ee40b1101cd0cbcca3eb2bf6cea64a6989e7dc08574ed5"} Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.064306 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.069377 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.075638 4948 scope.go:117] "RemoveContainer" containerID="9fb8cf56fea2f0a03dd3001535ece2773c621188fdb5c8ac5bbd4f0665e4adff" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.078792 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jw8ps"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.088747 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jw8ps"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.093316 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22gwb"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.093347 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-22gwb"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.117081 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l48pp"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.129917 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-l48pp"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.138636 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77jch"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.144870 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" podStartSLOduration=4.144844936 podStartE2EDuration="4.144844936s" podCreationTimestamp="2025-12-04 17:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:30:57.137691595 +0000 UTC m=+268.498766007" watchObservedRunningTime="2025-12-04 17:30:57.144844936 +0000 UTC m=+268.505919348" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.147835 4948 scope.go:117] "RemoveContainer" containerID="f67efc76a2eaf418e39efe02be8a46e657086e9af357b999596d15001218a3f9" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.147838 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77jch"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.160614 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hbqk5"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.161877 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hbqk5"] Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.205948 4948 scope.go:117] "RemoveContainer" containerID="a898f9fce9fab0e97ceda44d90f84a5d3d5abc7ea3d7144b9d59aabfa934b25d" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.225416 4948 scope.go:117] "RemoveContainer" containerID="9c897f54758cd272f87bb5aebee8cf08f477e45354f3232262543de51f30246d" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.241662 4948 scope.go:117] "RemoveContainer" 
containerID="318fab065d04f84f9e82f28d4f88ab0e4e33b2baa91493b6477288e5566bcafa" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.275966 4948 scope.go:117] "RemoveContainer" containerID="9a3935c90d9b28eae221025b6929ffc89ee5b1e51b6801c8da2ecd6cf10b3db1" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.296749 4948 scope.go:117] "RemoveContainer" containerID="77d8a8f65056279c810b1a6c249fc35052efd8d27e3c16327f9fc6e70ecab2f6" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.307508 4948 scope.go:117] "RemoveContainer" containerID="07a5a4fc53f48cb9b534fc2cd8c2f6c124fd43215d54cc5378f696a355e6ba80" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.321304 4948 scope.go:117] "RemoveContainer" containerID="623543b0e9d5a3e0b65bb1aff96a204b52c3882e33b9a9ecd54ce146b78e74cf" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.342086 4948 scope.go:117] "RemoveContainer" containerID="ae16fa2c605d20dd73cc12faf0c919c48f07d7430c67acb9feb5c7674881d65a" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.360347 4948 scope.go:117] "RemoveContainer" containerID="734fdda00dc33c05fbc5c04eced0697d9bdaf0b4131ebc66140dd75e12913045" Dec 04 17:30:57 crc kubenswrapper[4948]: I1204 17:30:57.375358 4948 scope.go:117] "RemoveContainer" containerID="f3c7c51bc6b9731eccdf013115cdb196c6ce1d0b2a68096f6728213c63a84504" Dec 04 17:30:58 crc kubenswrapper[4948]: I1204 17:30:58.920525 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" path="/var/lib/kubelet/pods/09f28c0e-7133-4236-9614-fe2fe6b5e2e2/volumes" Dec 04 17:30:58 crc kubenswrapper[4948]: I1204 17:30:58.921373 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" path="/var/lib/kubelet/pods/18aaaacf-fb8c-4ba8-ab03-b89ec705114b/volumes" Dec 04 17:30:58 crc kubenswrapper[4948]: I1204 17:30:58.921928 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" path="/var/lib/kubelet/pods/5e200fe3-fcc4-4b69-9937-6a5ea6233cdf/volumes" Dec 04 17:30:58 crc kubenswrapper[4948]: I1204 17:30:58.922352 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8a9450-7e86-4194-962d-566fee4563df" path="/var/lib/kubelet/pods/cc8a9450-7e86-4194-962d-566fee4563df/volumes" Dec 04 17:30:58 crc kubenswrapper[4948]: I1204 17:30:58.922866 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2914f1-50b7-4a3a-902e-000091874005" path="/var/lib/kubelet/pods/fc2914f1-50b7-4a3a-902e-000091874005/volumes" Dec 04 17:31:09 crc kubenswrapper[4948]: I1204 17:31:09.165849 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj"] Dec 04 17:31:09 crc kubenswrapper[4948]: I1204 17:31:09.166918 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" podUID="f4f479a3-49bb-4ea2-a784-47f537a251d6" containerName="controller-manager" containerID="cri-o://2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee" gracePeriod=30 Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.787957 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.927105 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-client-ca\") pod \"f4f479a3-49bb-4ea2-a784-47f537a251d6\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.927220 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-config\") pod \"f4f479a3-49bb-4ea2-a784-47f537a251d6\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.927303 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92f2n\" (UniqueName: \"kubernetes.io/projected/f4f479a3-49bb-4ea2-a784-47f537a251d6-kube-api-access-92f2n\") pod \"f4f479a3-49bb-4ea2-a784-47f537a251d6\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.927355 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f479a3-49bb-4ea2-a784-47f537a251d6-serving-cert\") pod \"f4f479a3-49bb-4ea2-a784-47f537a251d6\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.927395 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-proxy-ca-bundles\") pod \"f4f479a3-49bb-4ea2-a784-47f537a251d6\" (UID: \"f4f479a3-49bb-4ea2-a784-47f537a251d6\") " Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.928599 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4f479a3-49bb-4ea2-a784-47f537a251d6" (UID: "f4f479a3-49bb-4ea2-a784-47f537a251d6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.928735 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-config" (OuterVolumeSpecName: "config") pod "f4f479a3-49bb-4ea2-a784-47f537a251d6" (UID: "f4f479a3-49bb-4ea2-a784-47f537a251d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.928839 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f4f479a3-49bb-4ea2-a784-47f537a251d6" (UID: "f4f479a3-49bb-4ea2-a784-47f537a251d6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.934884 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f479a3-49bb-4ea2-a784-47f537a251d6-kube-api-access-92f2n" (OuterVolumeSpecName: "kube-api-access-92f2n") pod "f4f479a3-49bb-4ea2-a784-47f537a251d6" (UID: "f4f479a3-49bb-4ea2-a784-47f537a251d6"). InnerVolumeSpecName "kube-api-access-92f2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:09.937416 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f479a3-49bb-4ea2-a784-47f537a251d6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4f479a3-49bb-4ea2-a784-47f537a251d6" (UID: "f4f479a3-49bb-4ea2-a784-47f537a251d6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.028795 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92f2n\" (UniqueName: \"kubernetes.io/projected/f4f479a3-49bb-4ea2-a784-47f537a251d6-kube-api-access-92f2n\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.028829 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4f479a3-49bb-4ea2-a784-47f537a251d6-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.028841 4948 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.028852 4948 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.028864 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f479a3-49bb-4ea2-a784-47f537a251d6-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.160157 4948 generic.go:334] "Generic (PLEG): container finished" podID="f4f479a3-49bb-4ea2-a784-47f537a251d6" containerID="2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee" exitCode=0 Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.160210 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" event={"ID":"f4f479a3-49bb-4ea2-a784-47f537a251d6","Type":"ContainerDied","Data":"2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee"} Dec 04 17:31:10 crc 
kubenswrapper[4948]: I1204 17:31:10.160258 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.160282 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj" event={"ID":"f4f479a3-49bb-4ea2-a784-47f537a251d6","Type":"ContainerDied","Data":"e35f7475de1179317510ace39597a896b159871f2bfd415e16733cb9c37a9a52"} Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.160311 4948 scope.go:117] "RemoveContainer" containerID="2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.178696 4948 scope.go:117] "RemoveContainer" containerID="2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee" Dec 04 17:31:10 crc kubenswrapper[4948]: E1204 17:31:10.179345 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee\": container with ID starting with 2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee not found: ID does not exist" containerID="2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.179379 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee"} err="failed to get container status \"2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee\": rpc error: code = NotFound desc = could not find container \"2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee\": container with ID starting with 2ddca4fa391557f308225024cea4dcbe584ef6bdcb07436f15a62b538035ddee not found: ID does not exist" Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 
17:31:10.192525 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj"] Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.192599 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cdbdfd9bd-7fvbj"] Dec 04 17:31:10 crc kubenswrapper[4948]: I1204 17:31:10.921222 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f479a3-49bb-4ea2-a784-47f537a251d6" path="/var/lib/kubelet/pods/f4f479a3-49bb-4ea2-a784-47f537a251d6/volumes" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112373 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-767dff95c6-m7djk"] Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112565 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112579 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112591 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f479a3-49bb-4ea2-a784-47f537a251d6" containerName="controller-manager" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112599 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f479a3-49bb-4ea2-a784-47f537a251d6" containerName="controller-manager" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112608 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="extract-utilities" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112614 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="extract-utilities" Dec 04 17:31:11 crc 
kubenswrapper[4948]: E1204 17:31:11.112625 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112630 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112638 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112645 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112654 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112660 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112668 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112674 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112682 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="extract-utilities" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112688 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="extract-utilities" Dec 04 17:31:11 crc 
kubenswrapper[4948]: E1204 17:31:11.112695 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112701 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112707 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="extract-utilities" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112712 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="extract-utilities" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112719 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112724 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="extract-content" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112734 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerName="marketplace-operator" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112739 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerName="marketplace-operator" Dec 04 17:31:11 crc kubenswrapper[4948]: E1204 17:31:11.112748 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="extract-utilities" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112754 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="extract-utilities" Dec 04 17:31:11 
crc kubenswrapper[4948]: E1204 17:31:11.112761 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112766 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112839 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f479a3-49bb-4ea2-a784-47f537a251d6" containerName="controller-manager" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112847 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f28c0e-7133-4236-9614-fe2fe6b5e2e2" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112855 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2914f1-50b7-4a3a-902e-000091874005" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112861 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e200fe3-fcc4-4b69-9937-6a5ea6233cdf" containerName="marketplace-operator" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112869 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8a9450-7e86-4194-962d-566fee4563df" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.112880 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="18aaaacf-fb8c-4ba8-ab03-b89ec705114b" containerName="registry-server" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.113223 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.115064 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.115205 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.116160 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.116365 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.117072 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.117106 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.122206 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.122667 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-767dff95c6-m7djk"] Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.243376 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p25q\" (UniqueName: \"kubernetes.io/projected/f243828d-8489-4a2c-8917-a8bb92f9953a-kube-api-access-6p25q\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " 
pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.243707 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f243828d-8489-4a2c-8917-a8bb92f9953a-serving-cert\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.243848 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-proxy-ca-bundles\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.243973 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-config\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.244112 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-client-ca\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.345375 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p25q\" (UniqueName: 
\"kubernetes.io/projected/f243828d-8489-4a2c-8917-a8bb92f9953a-kube-api-access-6p25q\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.345455 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f243828d-8489-4a2c-8917-a8bb92f9953a-serving-cert\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.345490 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-proxy-ca-bundles\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.345524 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-config\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.345556 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-client-ca\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.346791 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-proxy-ca-bundles\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.346809 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-client-ca\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.347445 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f243828d-8489-4a2c-8917-a8bb92f9953a-config\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.350347 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f243828d-8489-4a2c-8917-a8bb92f9953a-serving-cert\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.364574 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p25q\" (UniqueName: \"kubernetes.io/projected/f243828d-8489-4a2c-8917-a8bb92f9953a-kube-api-access-6p25q\") pod \"controller-manager-767dff95c6-m7djk\" (UID: \"f243828d-8489-4a2c-8917-a8bb92f9953a\") " pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 
17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.443759 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:11 crc kubenswrapper[4948]: I1204 17:31:11.838754 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-767dff95c6-m7djk"] Dec 04 17:31:11 crc kubenswrapper[4948]: W1204 17:31:11.843748 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf243828d_8489_4a2c_8917_a8bb92f9953a.slice/crio-28a339b6de8718db7a818cc38b7f27baf06d49b22fce9853e6c2ca5b375d7163 WatchSource:0}: Error finding container 28a339b6de8718db7a818cc38b7f27baf06d49b22fce9853e6c2ca5b375d7163: Status 404 returned error can't find the container with id 28a339b6de8718db7a818cc38b7f27baf06d49b22fce9853e6c2ca5b375d7163 Dec 04 17:31:12 crc kubenswrapper[4948]: I1204 17:31:12.171618 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" event={"ID":"f243828d-8489-4a2c-8917-a8bb92f9953a","Type":"ContainerStarted","Data":"8b678535ab5d8341dd3d537ddd1c47e5d0d06e41498d07b9f0c18d0de0ad11bf"} Dec 04 17:31:12 crc kubenswrapper[4948]: I1204 17:31:12.171665 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" event={"ID":"f243828d-8489-4a2c-8917-a8bb92f9953a","Type":"ContainerStarted","Data":"28a339b6de8718db7a818cc38b7f27baf06d49b22fce9853e6c2ca5b375d7163"} Dec 04 17:31:12 crc kubenswrapper[4948]: I1204 17:31:12.172628 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:12 crc kubenswrapper[4948]: I1204 17:31:12.178176 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" Dec 04 17:31:12 crc kubenswrapper[4948]: I1204 17:31:12.219872 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-767dff95c6-m7djk" podStartSLOduration=3.219853291 podStartE2EDuration="3.219853291s" podCreationTimestamp="2025-12-04 17:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:31:12.200946469 +0000 UTC m=+283.562020871" watchObservedRunningTime="2025-12-04 17:31:12.219853291 +0000 UTC m=+283.580927693" Dec 04 17:31:26 crc kubenswrapper[4948]: I1204 17:31:26.975635 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-klrrz"] Dec 04 17:31:26 crc kubenswrapper[4948]: I1204 17:31:26.977103 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:26 crc kubenswrapper[4948]: I1204 17:31:26.979289 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 17:31:26 crc kubenswrapper[4948]: I1204 17:31:26.987670 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klrrz"] Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.080905 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-utilities\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.081101 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-catalog-content\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.081249 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slz8n\" (UniqueName: \"kubernetes.io/projected/a1ee2c3b-8e86-4667-a070-d63035fad5a8-kube-api-access-slz8n\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.182344 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slz8n\" (UniqueName: \"kubernetes.io/projected/a1ee2c3b-8e86-4667-a070-d63035fad5a8-kube-api-access-slz8n\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.182479 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-utilities\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.182515 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-catalog-content\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.183154 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-catalog-content\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.183489 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-utilities\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.222368 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slz8n\" (UniqueName: \"kubernetes.io/projected/a1ee2c3b-8e86-4667-a070-d63035fad5a8-kube-api-access-slz8n\") pod \"redhat-marketplace-klrrz\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.293236 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.376867 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m67vs"] Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.378902 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.382503 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.396793 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m67vs"] Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.485203 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-utilities\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.485333 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z948\" (UniqueName: \"kubernetes.io/projected/7321007a-5f13-450f-aefe-187f2f7fccce-kube-api-access-4z948\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.485369 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-catalog-content\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.586370 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-utilities\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " 
pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.586654 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z948\" (UniqueName: \"kubernetes.io/projected/7321007a-5f13-450f-aefe-187f2f7fccce-kube-api-access-4z948\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.586690 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-catalog-content\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.587269 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-catalog-content\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.587271 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-utilities\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.625255 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z948\" (UniqueName: \"kubernetes.io/projected/7321007a-5f13-450f-aefe-187f2f7fccce-kube-api-access-4z948\") pod \"redhat-operators-m67vs\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " pod="openshift-marketplace/redhat-operators-m67vs" Dec 
04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.709698 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:27 crc kubenswrapper[4948]: I1204 17:31:27.740795 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-klrrz"] Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.049817 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2j6s8"] Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.050797 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.060642 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2j6s8"] Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.093995 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-bound-sa-token\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094134 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0e2199-516f-4ab7-bec0-3314197b0308-trusted-ca\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094183 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/eb0e2199-516f-4ab7-bec0-3314197b0308-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094212 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb0e2199-516f-4ab7-bec0-3314197b0308-registry-certificates\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094248 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-registry-tls\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094293 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb0e2199-516f-4ab7-bec0-3314197b0308-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094329 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" 
Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.094349 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92cf\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-kube-api-access-n92cf\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.112398 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m67vs"] Dec 04 17:31:28 crc kubenswrapper[4948]: W1204 17:31:28.115253 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7321007a_5f13_450f_aefe_187f2f7fccce.slice/crio-06803ac3d665769db648dcdb9790d1b310a0165fc6379fb14605483f165582a5 WatchSource:0}: Error finding container 06803ac3d665769db648dcdb9790d1b310a0165fc6379fb14605483f165582a5: Status 404 returned error can't find the container with id 06803ac3d665769db648dcdb9790d1b310a0165fc6379fb14605483f165582a5 Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.127820 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195594 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb0e2199-516f-4ab7-bec0-3314197b0308-registry-certificates\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195646 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-registry-tls\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195680 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb0e2199-516f-4ab7-bec0-3314197b0308-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195706 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n92cf\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-kube-api-access-n92cf\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195740 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-bound-sa-token\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195765 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0e2199-516f-4ab7-bec0-3314197b0308-trusted-ca\") pod 
\"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.195795 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb0e2199-516f-4ab7-bec0-3314197b0308-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.196792 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0e2199-516f-4ab7-bec0-3314197b0308-trusted-ca\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.196923 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb0e2199-516f-4ab7-bec0-3314197b0308-registry-certificates\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.197183 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb0e2199-516f-4ab7-bec0-3314197b0308-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.201181 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/eb0e2199-516f-4ab7-bec0-3314197b0308-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.201297 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-registry-tls\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.210027 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-bound-sa-token\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.210335 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92cf\" (UniqueName: \"kubernetes.io/projected/eb0e2199-516f-4ab7-bec0-3314197b0308-kube-api-access-n92cf\") pod \"image-registry-66df7c8f76-2j6s8\" (UID: \"eb0e2199-516f-4ab7-bec0-3314197b0308\") " pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.253800 4948 generic.go:334] "Generic (PLEG): container finished" podID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerID="415d35e6292c2f70d118361b3f2350809c3b22d3d98edf56282948d679494bd0" exitCode=0 Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.253847 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" 
event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerDied","Data":"415d35e6292c2f70d118361b3f2350809c3b22d3d98edf56282948d679494bd0"} Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.253886 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerStarted","Data":"022d3cab728ec42696bc71e8c06ea4c3df064bc6bf7f8d33570df143f9c92125"} Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.254941 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m67vs" event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerStarted","Data":"06803ac3d665769db648dcdb9790d1b310a0165fc6379fb14605483f165582a5"} Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.407698 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.793423 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2j6s8"] Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.968349 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b8mwr"] Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.969784 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.971605 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 17:31:28 crc kubenswrapper[4948]: I1204 17:31:28.978164 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8mwr"] Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.046289 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t2bl\" (UniqueName: \"kubernetes.io/projected/1c25d318-4040-48ac-89b1-473380694ed3-kube-api-access-8t2bl\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.046358 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-utilities\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.046446 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-catalog-content\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.147474 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-catalog-content\") pod \"certified-operators-b8mwr\" (UID: 
\"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.147532 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t2bl\" (UniqueName: \"kubernetes.io/projected/1c25d318-4040-48ac-89b1-473380694ed3-kube-api-access-8t2bl\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.147580 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-utilities\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.148143 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-catalog-content\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.148143 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-utilities\") pod \"certified-operators-b8mwr\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.168431 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t2bl\" (UniqueName: \"kubernetes.io/projected/1c25d318-4040-48ac-89b1-473380694ed3-kube-api-access-8t2bl\") pod \"certified-operators-b8mwr\" (UID: 
\"1c25d318-4040-48ac-89b1-473380694ed3\") " pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.261153 4948 generic.go:334] "Generic (PLEG): container finished" podID="7321007a-5f13-450f-aefe-187f2f7fccce" containerID="b0321610bcb85b6233ed60b8f6775010af8d0801ffd5d7db493820f3b3ec7024" exitCode=0 Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.261256 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m67vs" event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerDied","Data":"b0321610bcb85b6233ed60b8f6775010af8d0801ffd5d7db493820f3b3ec7024"} Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.263644 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" event={"ID":"eb0e2199-516f-4ab7-bec0-3314197b0308","Type":"ContainerStarted","Data":"78abf803cc8b9e0fde601bb1e15b0ee613a84d35a56b205bcf9d3598e6ecc997"} Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.282435 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.670639 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8mwr"] Dec 04 17:31:29 crc kubenswrapper[4948]: W1204 17:31:29.677103 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c25d318_4040_48ac_89b1_473380694ed3.slice/crio-e363ac507e017eb6bb42c83acc2c20d5d417bd4faf9dfecedc421717c08bd0d1 WatchSource:0}: Error finding container e363ac507e017eb6bb42c83acc2c20d5d417bd4faf9dfecedc421717c08bd0d1: Status 404 returned error can't find the container with id e363ac507e017eb6bb42c83acc2c20d5d417bd4faf9dfecedc421717c08bd0d1 Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.972527 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgkmz"] Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.974156 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.981899 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 17:31:29 crc kubenswrapper[4948]: I1204 17:31:29.983655 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgkmz"] Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.056971 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-utilities\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.057023 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26bfm\" (UniqueName: \"kubernetes.io/projected/a4a41f24-a106-4070-8656-9344de9df965-kube-api-access-26bfm\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.057092 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-catalog-content\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.159295 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-catalog-content\") pod \"community-operators-xgkmz\" (UID: 
\"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.158664 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-catalog-content\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.159419 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-utilities\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.159766 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-utilities\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.159842 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26bfm\" (UniqueName: \"kubernetes.io/projected/a4a41f24-a106-4070-8656-9344de9df965-kube-api-access-26bfm\") pod \"community-operators-xgkmz\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.191910 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26bfm\" (UniqueName: \"kubernetes.io/projected/a4a41f24-a106-4070-8656-9344de9df965-kube-api-access-26bfm\") pod \"community-operators-xgkmz\" (UID: 
\"a4a41f24-a106-4070-8656-9344de9df965\") " pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.270307 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8mwr" event={"ID":"1c25d318-4040-48ac-89b1-473380694ed3","Type":"ContainerStarted","Data":"e363ac507e017eb6bb42c83acc2c20d5d417bd4faf9dfecedc421717c08bd0d1"} Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.272024 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" event={"ID":"eb0e2199-516f-4ab7-bec0-3314197b0308","Type":"ContainerStarted","Data":"e98918fb21c53af39efa06e660a8f221b77c3732e8c9b1fff1919b231c206b1e"} Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.290915 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:31:30 crc kubenswrapper[4948]: I1204 17:31:30.720462 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgkmz"] Dec 04 17:31:30 crc kubenswrapper[4948]: W1204 17:31:30.768806 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a41f24_a106_4070_8656_9344de9df965.slice/crio-7e6a2753f8859b49546a026df9225ac7a141aa64794794a45efe0157079cf1f6 WatchSource:0}: Error finding container 7e6a2753f8859b49546a026df9225ac7a141aa64794794a45efe0157079cf1f6: Status 404 returned error can't find the container with id 7e6a2753f8859b49546a026df9225ac7a141aa64794794a45efe0157079cf1f6 Dec 04 17:31:31 crc kubenswrapper[4948]: I1204 17:31:31.278459 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerStarted","Data":"398d0f69f53254f3b3e634764278555a9c979d8b0a1c0b01e468ee8a7f2b5fa7"} Dec 04 17:31:31 
crc kubenswrapper[4948]: I1204 17:31:31.278504 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerStarted","Data":"7e6a2753f8859b49546a026df9225ac7a141aa64794794a45efe0157079cf1f6"} Dec 04 17:31:31 crc kubenswrapper[4948]: I1204 17:31:31.281488 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerStarted","Data":"8b5540ebde591cf265a86997f3e970c8379511ebaae01757bb6cc8a882ed42e9"} Dec 04 17:31:31 crc kubenswrapper[4948]: I1204 17:31:31.282832 4948 generic.go:334] "Generic (PLEG): container finished" podID="1c25d318-4040-48ac-89b1-473380694ed3" containerID="59b67f5c18c4d670f05dedfe0c024d0de4f1a31063fc40c8a343a9fd1cf656ac" exitCode=0 Dec 04 17:31:31 crc kubenswrapper[4948]: I1204 17:31:31.283767 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8mwr" event={"ID":"1c25d318-4040-48ac-89b1-473380694ed3","Type":"ContainerDied","Data":"59b67f5c18c4d670f05dedfe0c024d0de4f1a31063fc40c8a343a9fd1cf656ac"} Dec 04 17:31:31 crc kubenswrapper[4948]: I1204 17:31:31.283797 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:31 crc kubenswrapper[4948]: I1204 17:31:31.332343 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" podStartSLOduration=3.332321488 podStartE2EDuration="3.332321488s" podCreationTimestamp="2025-12-04 17:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:31:31.33061675 +0000 UTC m=+302.691691152" watchObservedRunningTime="2025-12-04 17:31:31.332321488 +0000 UTC m=+302.693395890" Dec 04 
17:31:32 crc kubenswrapper[4948]: I1204 17:31:32.288495 4948 generic.go:334] "Generic (PLEG): container finished" podID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerID="8b5540ebde591cf265a86997f3e970c8379511ebaae01757bb6cc8a882ed42e9" exitCode=0 Dec 04 17:31:32 crc kubenswrapper[4948]: I1204 17:31:32.289548 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerDied","Data":"8b5540ebde591cf265a86997f3e970c8379511ebaae01757bb6cc8a882ed42e9"} Dec 04 17:31:32 crc kubenswrapper[4948]: I1204 17:31:32.297219 4948 generic.go:334] "Generic (PLEG): container finished" podID="a4a41f24-a106-4070-8656-9344de9df965" containerID="398d0f69f53254f3b3e634764278555a9c979d8b0a1c0b01e468ee8a7f2b5fa7" exitCode=0 Dec 04 17:31:32 crc kubenswrapper[4948]: I1204 17:31:32.297292 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerDied","Data":"398d0f69f53254f3b3e634764278555a9c979d8b0a1c0b01e468ee8a7f2b5fa7"} Dec 04 17:31:32 crc kubenswrapper[4948]: I1204 17:31:32.300919 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m67vs" event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerStarted","Data":"24fbf9343f0215f56669f02b70ae4cd0e19d904521f9bf3bfa07b0f0effb1707"} Dec 04 17:31:33 crc kubenswrapper[4948]: I1204 17:31:33.313386 4948 generic.go:334] "Generic (PLEG): container finished" podID="7321007a-5f13-450f-aefe-187f2f7fccce" containerID="24fbf9343f0215f56669f02b70ae4cd0e19d904521f9bf3bfa07b0f0effb1707" exitCode=0 Dec 04 17:31:33 crc kubenswrapper[4948]: I1204 17:31:33.313445 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m67vs" 
event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerDied","Data":"24fbf9343f0215f56669f02b70ae4cd0e19d904521f9bf3bfa07b0f0effb1707"} Dec 04 17:31:34 crc kubenswrapper[4948]: I1204 17:31:34.321642 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerStarted","Data":"2e95457a9abf3231097f156ef500f258026b5f0d3e7e85d93aa573585040bd91"} Dec 04 17:31:34 crc kubenswrapper[4948]: I1204 17:31:34.352824 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-klrrz" podStartSLOduration=2.74851775 podStartE2EDuration="8.352776901s" podCreationTimestamp="2025-12-04 17:31:26 +0000 UTC" firstStartedPulling="2025-12-04 17:31:28.255251693 +0000 UTC m=+299.616326095" lastFinishedPulling="2025-12-04 17:31:33.859510844 +0000 UTC m=+305.220585246" observedRunningTime="2025-12-04 17:31:34.344386795 +0000 UTC m=+305.705461197" watchObservedRunningTime="2025-12-04 17:31:34.352776901 +0000 UTC m=+305.713851303" Dec 04 17:31:37 crc kubenswrapper[4948]: I1204 17:31:37.294434 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:37 crc kubenswrapper[4948]: I1204 17:31:37.295736 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:37 crc kubenswrapper[4948]: I1204 17:31:37.348003 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:38 crc kubenswrapper[4948]: I1204 17:31:38.408675 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:31:48 crc kubenswrapper[4948]: I1204 17:31:48.416086 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-2j6s8" Dec 04 17:31:48 crc kubenswrapper[4948]: I1204 17:31:48.466885 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g7mvh"] Dec 04 17:31:49 crc kubenswrapper[4948]: I1204 17:31:49.185976 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb"] Dec 04 17:31:49 crc kubenswrapper[4948]: I1204 17:31:49.186214 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" podUID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" containerName="route-controller-manager" containerID="cri-o://bb8aec7d9929b01ab7d19192ce372f31f424d8ec3e2546f4c64596094fbf66e4" gracePeriod=30 Dec 04 17:31:51 crc kubenswrapper[4948]: I1204 17:31:51.440287 4948 patch_prober.go:28] interesting pod/route-controller-manager-78cb5fd749-8fhwb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 04 17:31:51 crc kubenswrapper[4948]: I1204 17:31:51.440781 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" podUID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.458195 4948 generic.go:334] "Generic (PLEG): container finished" podID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" containerID="bb8aec7d9929b01ab7d19192ce372f31f424d8ec3e2546f4c64596094fbf66e4" exitCode=0 Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.458476 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" event={"ID":"bcf79359-9ae9-422d-9b67-8dc7bba891f1","Type":"ContainerDied","Data":"bb8aec7d9929b01ab7d19192ce372f31f424d8ec3e2546f4c64596094fbf66e4"} Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.741611 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.776995 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k"] Dec 04 17:31:54 crc kubenswrapper[4948]: E1204 17:31:54.777224 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" containerName="route-controller-manager" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.777236 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" containerName="route-controller-manager" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.777333 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" containerName="route-controller-manager" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.777759 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.797723 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k"] Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859403 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-config\") pod \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859478 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4475\" (UniqueName: \"kubernetes.io/projected/bcf79359-9ae9-422d-9b67-8dc7bba891f1-kube-api-access-m4475\") pod \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859502 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-client-ca\") pod \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859561 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf79359-9ae9-422d-9b67-8dc7bba891f1-serving-cert\") pod \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\" (UID: \"bcf79359-9ae9-422d-9b67-8dc7bba891f1\") " Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859744 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwwp\" (UniqueName: \"kubernetes.io/projected/4f56609c-4334-4cce-8391-a245b6dea97f-kube-api-access-wcwwp\") 
pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859769 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f56609c-4334-4cce-8391-a245b6dea97f-client-ca\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859795 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f56609c-4334-4cce-8391-a245b6dea97f-serving-cert\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.859834 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f56609c-4334-4cce-8391-a245b6dea97f-config\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.860356 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-config" (OuterVolumeSpecName: "config") pod "bcf79359-9ae9-422d-9b67-8dc7bba891f1" (UID: "bcf79359-9ae9-422d-9b67-8dc7bba891f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.860438 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "bcf79359-9ae9-422d-9b67-8dc7bba891f1" (UID: "bcf79359-9ae9-422d-9b67-8dc7bba891f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.864464 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf79359-9ae9-422d-9b67-8dc7bba891f1-kube-api-access-m4475" (OuterVolumeSpecName: "kube-api-access-m4475") pod "bcf79359-9ae9-422d-9b67-8dc7bba891f1" (UID: "bcf79359-9ae9-422d-9b67-8dc7bba891f1"). InnerVolumeSpecName "kube-api-access-m4475". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.864935 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf79359-9ae9-422d-9b67-8dc7bba891f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bcf79359-9ae9-422d-9b67-8dc7bba891f1" (UID: "bcf79359-9ae9-422d-9b67-8dc7bba891f1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960692 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwwp\" (UniqueName: \"kubernetes.io/projected/4f56609c-4334-4cce-8391-a245b6dea97f-kube-api-access-wcwwp\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960744 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f56609c-4334-4cce-8391-a245b6dea97f-client-ca\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960772 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f56609c-4334-4cce-8391-a245b6dea97f-serving-cert\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960813 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f56609c-4334-4cce-8391-a245b6dea97f-config\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960880 4948 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bcf79359-9ae9-422d-9b67-8dc7bba891f1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960891 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960903 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4475\" (UniqueName: \"kubernetes.io/projected/bcf79359-9ae9-422d-9b67-8dc7bba891f1-kube-api-access-m4475\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.960913 4948 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bcf79359-9ae9-422d-9b67-8dc7bba891f1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.962090 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f56609c-4334-4cce-8391-a245b6dea97f-config\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.963117 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f56609c-4334-4cce-8391-a245b6dea97f-client-ca\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.966501 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f56609c-4334-4cce-8391-a245b6dea97f-serving-cert\") pod 
\"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:54 crc kubenswrapper[4948]: I1204 17:31:54.979788 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwwp\" (UniqueName: \"kubernetes.io/projected/4f56609c-4334-4cce-8391-a245b6dea97f-kube-api-access-wcwwp\") pod \"route-controller-manager-5c9c47fff6-58w7k\" (UID: \"4f56609c-4334-4cce-8391-a245b6dea97f\") " pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.091685 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.467757 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m67vs" event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerStarted","Data":"77191cd3e5654d54a6ecaf032bc2061e548410ddca0b9980132e7cc1b111acb0"} Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.469507 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.469619 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb" event={"ID":"bcf79359-9ae9-422d-9b67-8dc7bba891f1","Type":"ContainerDied","Data":"9198d6730780a50cc5b441ae9ea6c56852693b60c351f421175b39c7c0fd6017"} Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.469723 4948 scope.go:117] "RemoveContainer" containerID="bb8aec7d9929b01ab7d19192ce372f31f424d8ec3e2546f4c64596094fbf66e4" Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.473016 4948 generic.go:334] "Generic (PLEG): container finished" podID="1c25d318-4040-48ac-89b1-473380694ed3" containerID="5debea8774c94fd40c94bc62e49b9b5cf084a570940d077f79aa7e6ecfa3005c" exitCode=0 Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.473114 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8mwr" event={"ID":"1c25d318-4040-48ac-89b1-473380694ed3","Type":"ContainerDied","Data":"5debea8774c94fd40c94bc62e49b9b5cf084a570940d077f79aa7e6ecfa3005c"} Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.479913 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k"] Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.481674 4948 generic.go:334] "Generic (PLEG): container finished" podID="a4a41f24-a106-4070-8656-9344de9df965" containerID="d50cfd819f02966d3447e2cc88676ba40a4a48b9bd42e768d555e2c50a3ac8f7" exitCode=0 Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.481729 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerDied","Data":"d50cfd819f02966d3447e2cc88676ba40a4a48b9bd42e768d555e2c50a3ac8f7"} Dec 04 
17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.499026 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m67vs" podStartSLOduration=3.7937944420000003 podStartE2EDuration="28.499006816s" podCreationTimestamp="2025-12-04 17:31:27 +0000 UTC" firstStartedPulling="2025-12-04 17:31:29.263248201 +0000 UTC m=+300.624322623" lastFinishedPulling="2025-12-04 17:31:53.968460555 +0000 UTC m=+325.329534997" observedRunningTime="2025-12-04 17:31:55.494406962 +0000 UTC m=+326.855481374" watchObservedRunningTime="2025-12-04 17:31:55.499006816 +0000 UTC m=+326.860081218" Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.512143 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb"] Dec 04 17:31:55 crc kubenswrapper[4948]: I1204 17:31:55.516109 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78cb5fd749-8fhwb"] Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.488193 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" event={"ID":"4f56609c-4334-4cce-8391-a245b6dea97f","Type":"ContainerStarted","Data":"81dd7636190f9f093cd34546c1c51da331e6ba94727434a40f5763f2ad08c48f"} Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.488456 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" event={"ID":"4f56609c-4334-4cce-8391-a245b6dea97f","Type":"ContainerStarted","Data":"1f17f6c96fec7c48a29103e4ac887ba3978114697ffccc4132e40156b8f1d541"} Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.488474 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:56 crc kubenswrapper[4948]: 
I1204 17:31:56.492624 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8mwr" event={"ID":"1c25d318-4040-48ac-89b1-473380694ed3","Type":"ContainerStarted","Data":"89108754633e4e4f5c9ff509dd230d63bc42aa9d84ef848c5b23ca0db3409b2c"} Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.495227 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerStarted","Data":"70ac6c7590551d7b66a364b208ee2b9c1c4aace27e5f4a20cd395be26dd58306"} Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.495732 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.512497 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c9c47fff6-58w7k" podStartSLOduration=7.512476664 podStartE2EDuration="7.512476664s" podCreationTimestamp="2025-12-04 17:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:31:56.508897358 +0000 UTC m=+327.869971770" watchObservedRunningTime="2025-12-04 17:31:56.512476664 +0000 UTC m=+327.873551066" Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.532189 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b8mwr" podStartSLOduration=3.935412205 podStartE2EDuration="28.532164667s" podCreationTimestamp="2025-12-04 17:31:28 +0000 UTC" firstStartedPulling="2025-12-04 17:31:31.307832819 +0000 UTC m=+302.668907221" lastFinishedPulling="2025-12-04 17:31:55.904585271 +0000 UTC m=+327.265659683" observedRunningTime="2025-12-04 17:31:56.526638767 +0000 UTC m=+327.887713179" 
watchObservedRunningTime="2025-12-04 17:31:56.532164667 +0000 UTC m=+327.893239069" Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.564325 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgkmz" podStartSLOduration=3.984443993 podStartE2EDuration="27.564298815s" podCreationTimestamp="2025-12-04 17:31:29 +0000 UTC" firstStartedPulling="2025-12-04 17:31:32.298669294 +0000 UTC m=+303.659743706" lastFinishedPulling="2025-12-04 17:31:55.878524126 +0000 UTC m=+327.239598528" observedRunningTime="2025-12-04 17:31:56.552661501 +0000 UTC m=+327.913735913" watchObservedRunningTime="2025-12-04 17:31:56.564298815 +0000 UTC m=+327.925373217" Dec 04 17:31:56 crc kubenswrapper[4948]: I1204 17:31:56.921454 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf79359-9ae9-422d-9b67-8dc7bba891f1" path="/var/lib/kubelet/pods/bcf79359-9ae9-422d-9b67-8dc7bba891f1/volumes" Dec 04 17:31:57 crc kubenswrapper[4948]: I1204 17:31:57.710232 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:57 crc kubenswrapper[4948]: I1204 17:31:57.710290 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:31:58 crc kubenswrapper[4948]: I1204 17:31:58.755805 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m67vs" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="registry-server" probeResult="failure" output=< Dec 04 17:31:58 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 17:31:58 crc kubenswrapper[4948]: > Dec 04 17:31:59 crc kubenswrapper[4948]: I1204 17:31:59.282794 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:59 crc kubenswrapper[4948]: I1204 
17:31:59.282867 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:31:59 crc kubenswrapper[4948]: I1204 17:31:59.321695 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:32:00 crc kubenswrapper[4948]: I1204 17:32:00.292078 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:32:00 crc kubenswrapper[4948]: I1204 17:32:00.292410 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:32:00 crc kubenswrapper[4948]: I1204 17:32:00.342495 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:32:01 crc kubenswrapper[4948]: I1204 17:32:01.603744 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:32:07 crc kubenswrapper[4948]: I1204 17:32:07.742663 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:32:07 crc kubenswrapper[4948]: I1204 17:32:07.784692 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:32:09 crc kubenswrapper[4948]: I1204 17:32:09.329007 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:32:13 crc kubenswrapper[4948]: I1204 17:32:13.567539 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" podUID="0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" containerName="registry" 
containerID="cri-o://c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e" gracePeriod=30 Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.485112 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.599583 4948 generic.go:334] "Generic (PLEG): container finished" podID="0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" containerID="c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e" exitCode=0 Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.599649 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.599644 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" event={"ID":"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863","Type":"ContainerDied","Data":"c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e"} Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.599909 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g7mvh" event={"ID":"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863","Type":"ContainerDied","Data":"df842ebc971179dc5117715d5ee59142ae2db975ce699ff54d688a730381f8a7"} Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.599944 4948 scope.go:117] "RemoveContainer" containerID="c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.621588 4948 scope.go:117] "RemoveContainer" containerID="c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e" Dec 04 17:32:14 crc kubenswrapper[4948]: E1204 17:32:14.622034 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e\": container with ID starting with c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e not found: ID does not exist" containerID="c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.622091 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e"} err="failed to get container status \"c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e\": rpc error: code = NotFound desc = could not find container \"c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e\": container with ID starting with c29b598314023df9574eb5d0108309b63ce867f195a9d3b0417d5ffef3f59f4e not found: ID does not exist" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.692819 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-ca-trust-extracted\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.692880 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-bound-sa-token\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.692915 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-tls\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.692944 
4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-installation-pull-secrets\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.693121 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.693188 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j4pg\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-kube-api-access-2j4pg\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.693230 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-trusted-ca\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.693288 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-certificates\") pod \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\" (UID: \"0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863\") " Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.693910 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-trusted-ca" (OuterVolumeSpecName: 
"trusted-ca") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.694064 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.698810 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.698891 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.699206 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.699998 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-kube-api-access-2j4pg" (OuterVolumeSpecName: "kube-api-access-2j4pg") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "kube-api-access-2j4pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.706413 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.710263 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" (UID: "0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795179 4948 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795218 4948 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795231 4948 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795244 4948 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795256 4948 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795267 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j4pg\" (UniqueName: \"kubernetes.io/projected/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-kube-api-access-2j4pg\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.795280 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:32:14 crc 
kubenswrapper[4948]: I1204 17:32:14.947580 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g7mvh"] Dec 04 17:32:14 crc kubenswrapper[4948]: I1204 17:32:14.953469 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g7mvh"] Dec 04 17:32:16 crc kubenswrapper[4948]: I1204 17:32:16.927838 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" path="/var/lib/kubelet/pods/0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863/volumes" Dec 04 17:32:40 crc kubenswrapper[4948]: I1204 17:32:40.625630 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:32:40 crc kubenswrapper[4948]: I1204 17:32:40.626241 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:33:10 crc kubenswrapper[4948]: I1204 17:33:10.625580 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:33:10 crc kubenswrapper[4948]: I1204 17:33:10.626139 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:33:29 crc kubenswrapper[4948]: I1204 17:33:29.288365 4948 scope.go:117] "RemoveContainer" containerID="813548feb85ed86684be112b00d9e592abdc413274bf21d3e2532a759e46104b" Dec 04 17:33:40 crc kubenswrapper[4948]: I1204 17:33:40.625868 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:33:40 crc kubenswrapper[4948]: I1204 17:33:40.626815 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:33:40 crc kubenswrapper[4948]: I1204 17:33:40.626906 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:33:40 crc kubenswrapper[4948]: I1204 17:33:40.628186 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1115719e47aa6bf1c1453a2c9bdd06db75016c207034cc5d723bbc4c3177a31"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:33:40 crc kubenswrapper[4948]: I1204 17:33:40.628318 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" 
containerID="cri-o://d1115719e47aa6bf1c1453a2c9bdd06db75016c207034cc5d723bbc4c3177a31" gracePeriod=600 Dec 04 17:33:41 crc kubenswrapper[4948]: I1204 17:33:41.123758 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="d1115719e47aa6bf1c1453a2c9bdd06db75016c207034cc5d723bbc4c3177a31" exitCode=0 Dec 04 17:33:41 crc kubenswrapper[4948]: I1204 17:33:41.123895 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"d1115719e47aa6bf1c1453a2c9bdd06db75016c207034cc5d723bbc4c3177a31"} Dec 04 17:33:41 crc kubenswrapper[4948]: I1204 17:33:41.124535 4948 scope.go:117] "RemoveContainer" containerID="8e811b8000b0a1451742559953ae4b8ceaef08af55bb4663a9967a43362e5d3b" Dec 04 17:33:42 crc kubenswrapper[4948]: I1204 17:33:42.131966 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"f3c6c114e8dd7ff1e3b5d99fe1de39a49ef32c8d02345e74e4c0478fcc3c5397"} Dec 04 17:36:10 crc kubenswrapper[4948]: I1204 17:36:10.625586 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:36:10 crc kubenswrapper[4948]: I1204 17:36:10.626245 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:36:40 crc kubenswrapper[4948]: 
I1204 17:36:40.625773 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:36:40 crc kubenswrapper[4948]: I1204 17:36:40.626426 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:37:10 crc kubenswrapper[4948]: I1204 17:37:10.625340 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:37:10 crc kubenswrapper[4948]: I1204 17:37:10.626136 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:37:10 crc kubenswrapper[4948]: I1204 17:37:10.626216 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:37:10 crc kubenswrapper[4948]: I1204 17:37:10.627301 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3c6c114e8dd7ff1e3b5d99fe1de39a49ef32c8d02345e74e4c0478fcc3c5397"} 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:37:10 crc kubenswrapper[4948]: I1204 17:37:10.627409 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://f3c6c114e8dd7ff1e3b5d99fe1de39a49ef32c8d02345e74e4c0478fcc3c5397" gracePeriod=600 Dec 04 17:37:11 crc kubenswrapper[4948]: I1204 17:37:11.308291 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="f3c6c114e8dd7ff1e3b5d99fe1de39a49ef32c8d02345e74e4c0478fcc3c5397" exitCode=0 Dec 04 17:37:11 crc kubenswrapper[4948]: I1204 17:37:11.308373 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"f3c6c114e8dd7ff1e3b5d99fe1de39a49ef32c8d02345e74e4c0478fcc3c5397"} Dec 04 17:37:11 crc kubenswrapper[4948]: I1204 17:37:11.308746 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"6308f9cf478332f1e14942b84d9243079dd40a48776f7e33bd8faea91d259d32"} Dec 04 17:37:11 crc kubenswrapper[4948]: I1204 17:37:11.308769 4948 scope.go:117] "RemoveContainer" containerID="d1115719e47aa6bf1c1453a2c9bdd06db75016c207034cc5d723bbc4c3177a31" Dec 04 17:37:33 crc kubenswrapper[4948]: I1204 17:37:33.490956 4948 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 17:39:10 crc kubenswrapper[4948]: I1204 17:39:10.624994 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:39:10 crc kubenswrapper[4948]: I1204 17:39:10.625650 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:39:40 crc kubenswrapper[4948]: I1204 17:39:40.625359 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:39:40 crc kubenswrapper[4948]: I1204 17:39:40.626205 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.634610 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4cnmm"] Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640185 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-controller" containerID="cri-o://fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640308 4948 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="nbdb" containerID="cri-o://664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640609 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="sbdb" containerID="cri-o://292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640637 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640660 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="northd" containerID="cri-o://73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640683 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-node" containerID="cri-o://25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.640705 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-acl-logging" 
containerID="cri-o://5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" gracePeriod=30 Dec 04 17:40:00 crc kubenswrapper[4948]: I1204 17:40:00.733709 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovnkube-controller" containerID="cri-o://9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" gracePeriod=30 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.039137 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4cnmm_8149892b-eb94-4d2d-99b3-cebf34efa32a/ovn-acl-logging/0.log" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.040096 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4cnmm_8149892b-eb94-4d2d-99b3-cebf34efa32a/ovn-controller/0.log" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.040803 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081491 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-var-lib-openvswitch\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081540 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-netns\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081561 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-node-log\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081605 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-script-lib\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081621 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-etc-openvswitch\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081659 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-env-overrides\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081689 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-systemd-units\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081730 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-bin\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081750 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-config\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081768 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-ovn-kubernetes\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081788 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-kubelet\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 
crc kubenswrapper[4948]: I1204 17:40:01.081807 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mph6j\" (UniqueName: \"kubernetes.io/projected/8149892b-eb94-4d2d-99b3-cebf34efa32a-kube-api-access-mph6j\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081848 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081886 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-systemd\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081910 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-netd\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081935 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovn-node-metrics-cert\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.081976 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-ovn\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082002 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-openvswitch\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082022 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-log-socket\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082057 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-slash\") pod \"8149892b-eb94-4d2d-99b3-cebf34efa32a\" (UID: \"8149892b-eb94-4d2d-99b3-cebf34efa32a\") " Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082022 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082185 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082236 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082281 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-node-log" (OuterVolumeSpecName: "node-log") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082296 4948 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082312 4948 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082321 4948 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082410 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082867 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082973 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082983 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.082971 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.083112 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-log-socket" (OuterVolumeSpecName: "log-socket") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.083125 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.083201 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.083230 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.083273 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.083293 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.084950 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.085102 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-slash" (OuterVolumeSpecName: "host-slash") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.090702 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8149892b-eb94-4d2d-99b3-cebf34efa32a-kube-api-access-mph6j" (OuterVolumeSpecName: "kube-api-access-mph6j") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "kube-api-access-mph6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.094866 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.096912 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7b4ps"] Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097204 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-node" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.097279 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-node" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097333 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" containerName="registry" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.097386 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" containerName="registry" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097439 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kubecfg-setup" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.097491 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kubecfg-setup" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097545 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-acl-logging" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.097595 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-acl-logging" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097646 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovnkube-controller" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.097698 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovnkube-controller" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097745 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="sbdb" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.097793 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="sbdb" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.097843 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="northd" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.102099 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="northd" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.102514 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="nbdb" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.102633 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="nbdb" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.102719 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" 
containerName="kube-rbac-proxy-ovn-metrics" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.102774 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.102844 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-controller" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.102916 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-controller" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.102317 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "8149892b-eb94-4d2d-99b3-cebf34efa32a" (UID: "8149892b-eb94-4d2d-99b3-cebf34efa32a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.104132 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="nbdb" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.104223 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-node" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.104288 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovnkube-controller" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.104339 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="sbdb" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.106237 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="northd" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.106276 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fce1184-fa7a-4aab-8cd8-8c0bbc0d7863" containerName="registry" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.106301 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-acl-logging" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.106321 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="ovn-controller" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.106338 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.117234 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.183800 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovnkube-config\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.183904 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-systemd-units\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.183956 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184010 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5bwd\" (UniqueName: \"kubernetes.io/projected/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-kube-api-access-j5bwd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184094 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovnkube-script-lib\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184136 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-systemd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184207 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-cni-bin\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184270 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184305 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovn-node-metrics-cert\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184372 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-etc-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184542 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-node-log\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184613 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-log-socket\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184641 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-env-overrides\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184678 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-slash\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184709 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-run-netns\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184750 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184815 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-cni-netd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184852 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-var-lib-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.184886 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-ovn\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185030 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-kubelet\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185171 4948 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185192 4948 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185209 4948 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185225 4948 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185240 4948 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185252 4948 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185266 4948 reconciler_common.go:293] "Volume 
detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-log-socket\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185279 4948 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-slash\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185294 4948 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-node-log\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185307 4948 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185319 4948 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185335 4948 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185352 4948 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185364 4948 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8149892b-eb94-4d2d-99b3-cebf34efa32a-ovnkube-config\") on node 
\"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185416 4948 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185433 4948 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8149892b-eb94-4d2d-99b3-cebf34efa32a-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.185446 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mph6j\" (UniqueName: \"kubernetes.io/projected/8149892b-eb94-4d2d-99b3-cebf34efa32a-kube-api-access-mph6j\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286447 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-cni-netd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286506 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-var-lib-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286528 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-ovn\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286563 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-kubelet\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286592 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovnkube-config\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286616 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-systemd-units\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286636 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-cni-netd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286660 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-var-lib-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: 
I1204 17:40:01.286647 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286704 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286735 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-kubelet\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286775 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-ovn\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286803 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-systemd-units\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286832 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5bwd\" (UniqueName: \"kubernetes.io/projected/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-kube-api-access-j5bwd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286862 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovnkube-script-lib\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286899 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-systemd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286931 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-cni-bin\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286965 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.286986 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovn-node-metrics-cert\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287126 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-etc-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287186 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-node-log\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287212 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-log-socket\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287213 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-systemd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287231 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-env-overrides\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287271 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-slash\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287295 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-run-netns\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287327 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287424 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-run-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287459 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-cni-bin\") pod 
\"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287488 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-run-ovn-kubernetes\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287571 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovnkube-config\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287628 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-log-socket\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287658 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-etc-openvswitch\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287685 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-node-log\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.287710 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-slash\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.288021 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-host-run-netns\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.288244 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-env-overrides\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.288429 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovnkube-script-lib\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.296548 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-ovn-node-metrics-cert\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 
17:40:01.302965 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5bwd\" (UniqueName: \"kubernetes.io/projected/3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0-kube-api-access-j5bwd\") pod \"ovnkube-node-7b4ps\" (UID: \"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0\") " pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.432395 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.472247 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lz7z7_cda64a2b-9444-49d3-bee6-21e8c2bae502/kube-multus/0.log" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.472453 4948 generic.go:334] "Generic (PLEG): container finished" podID="cda64a2b-9444-49d3-bee6-21e8c2bae502" containerID="43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7" exitCode=2 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.472557 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lz7z7" event={"ID":"cda64a2b-9444-49d3-bee6-21e8c2bae502","Type":"ContainerDied","Data":"43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.473154 4948 scope.go:117] "RemoveContainer" containerID="43f48d72e1f3a564dd6ecff4ecdc5edaa965aafdf7a325ab78387932d3908ba7" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.473621 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"20104e57833360c004b9b1c238d53e39c1280ef53e59bba3c684798bf145192e"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.481376 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4cnmm_8149892b-eb94-4d2d-99b3-cebf34efa32a/ovn-acl-logging/0.log" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482231 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4cnmm_8149892b-eb94-4d2d-99b3-cebf34efa32a/ovn-controller/0.log" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482663 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" exitCode=0 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482691 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" exitCode=0 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482702 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" exitCode=0 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482711 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" exitCode=0 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482720 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" exitCode=0 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482730 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" exitCode=0 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482739 4948 generic.go:334] "Generic (PLEG): 
container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" exitCode=143 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482750 4948 generic.go:334] "Generic (PLEG): container finished" podID="8149892b-eb94-4d2d-99b3-cebf34efa32a" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" exitCode=143 Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482774 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482800 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482814 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482826 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482838 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} 
Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482849 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482862 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482876 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482883 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482892 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482901 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482908 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482914 4948 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482921 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482928 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482934 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482940 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482945 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.482952 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483024 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 
17:40:01.483036 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483072 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483079 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483085 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483090 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483096 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483102 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483125 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 
17:40:01.483131 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483159 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" event={"ID":"8149892b-eb94-4d2d-99b3-cebf34efa32a","Type":"ContainerDied","Data":"e48d84f8b3d426f09f1bab699d96da730be760e50d79a9cd9d503beb7952d1d7"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483168 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483175 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483180 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483185 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483190 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483195 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483221 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483227 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483232 4948 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483249 4948 scope.go:117] "RemoveContainer" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.483521 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4cnmm" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.507645 4948 scope.go:117] "RemoveContainer" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.518602 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4cnmm"] Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.521506 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4cnmm"] Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.541305 4948 scope.go:117] "RemoveContainer" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.558696 4948 scope.go:117] "RemoveContainer" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.573170 4948 scope.go:117] "RemoveContainer" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.636110 4948 scope.go:117] "RemoveContainer" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.654083 4948 scope.go:117] "RemoveContainer" containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.665146 4948 scope.go:117] "RemoveContainer" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.687511 4948 scope.go:117] "RemoveContainer" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.703153 4948 scope.go:117] "RemoveContainer" 
containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.703513 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": container with ID starting with 9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61 not found: ID does not exist" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.703552 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} err="failed to get container status \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": rpc error: code = NotFound desc = could not find container \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": container with ID starting with 9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.703578 4948 scope.go:117] "RemoveContainer" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.703992 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": container with ID starting with 292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff not found: ID does not exist" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704019 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} err="failed to get container status \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": rpc error: code = NotFound desc = could not find container \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": container with ID starting with 292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704037 4948 scope.go:117] "RemoveContainer" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.704370 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": container with ID starting with 664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911 not found: ID does not exist" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704397 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} err="failed to get container status \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": rpc error: code = NotFound desc = could not find container \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": container with ID starting with 664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704417 4948 scope.go:117] "RemoveContainer" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.704635 4948 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": container with ID starting with 73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030 not found: ID does not exist" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704661 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} err="failed to get container status \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": rpc error: code = NotFound desc = could not find container \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": container with ID starting with 73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704679 4948 scope.go:117] "RemoveContainer" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.704842 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": container with ID starting with be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105 not found: ID does not exist" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704868 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} err="failed to get container status \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": rpc error: code = NotFound desc = could not find container 
\"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": container with ID starting with be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.704886 4948 scope.go:117] "RemoveContainer" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.705171 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": container with ID starting with 25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32 not found: ID does not exist" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.705198 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} err="failed to get container status \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": rpc error: code = NotFound desc = could not find container \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": container with ID starting with 25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.705216 4948 scope.go:117] "RemoveContainer" containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.705514 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": container with ID starting with 5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900 not found: ID does not exist" 
containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.705539 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} err="failed to get container status \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": rpc error: code = NotFound desc = could not find container \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": container with ID starting with 5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.705556 4948 scope.go:117] "RemoveContainer" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.705744 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": container with ID starting with fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa not found: ID does not exist" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.705769 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} err="failed to get container status \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": rpc error: code = NotFound desc = could not find container \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": container with ID starting with fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.705785 4948 scope.go:117] 
"RemoveContainer" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" Dec 04 17:40:01 crc kubenswrapper[4948]: E1204 17:40:01.706036 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": container with ID starting with fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78 not found: ID does not exist" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.706077 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} err="failed to get container status \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": rpc error: code = NotFound desc = could not find container \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": container with ID starting with fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.706094 4948 scope.go:117] "RemoveContainer" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.706339 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} err="failed to get container status \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": rpc error: code = NotFound desc = could not find container \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": container with ID starting with 9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.706366 4948 
scope.go:117] "RemoveContainer" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.706592 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} err="failed to get container status \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": rpc error: code = NotFound desc = could not find container \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": container with ID starting with 292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.706627 4948 scope.go:117] "RemoveContainer" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.707013 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} err="failed to get container status \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": rpc error: code = NotFound desc = could not find container \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": container with ID starting with 664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.707070 4948 scope.go:117] "RemoveContainer" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.707481 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} err="failed to get container status \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": rpc 
error: code = NotFound desc = could not find container \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": container with ID starting with 73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.707512 4948 scope.go:117] "RemoveContainer" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.707813 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} err="failed to get container status \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": rpc error: code = NotFound desc = could not find container \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": container with ID starting with be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.707837 4948 scope.go:117] "RemoveContainer" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.708204 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} err="failed to get container status \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": rpc error: code = NotFound desc = could not find container \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": container with ID starting with 25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.708228 4948 scope.go:117] "RemoveContainer" containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" Dec 04 17:40:01 crc 
kubenswrapper[4948]: I1204 17:40:01.708434 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} err="failed to get container status \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": rpc error: code = NotFound desc = could not find container \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": container with ID starting with 5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.708458 4948 scope.go:117] "RemoveContainer" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.708775 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} err="failed to get container status \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": rpc error: code = NotFound desc = could not find container \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": container with ID starting with fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.708798 4948 scope.go:117] "RemoveContainer" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709029 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} err="failed to get container status \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": rpc error: code = NotFound desc = could not find container \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": container 
with ID starting with fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709058 4948 scope.go:117] "RemoveContainer" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709294 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} err="failed to get container status \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": rpc error: code = NotFound desc = could not find container \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": container with ID starting with 9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709313 4948 scope.go:117] "RemoveContainer" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709566 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} err="failed to get container status \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": rpc error: code = NotFound desc = could not find container \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": container with ID starting with 292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709596 4948 scope.go:117] "RemoveContainer" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709896 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} err="failed to get container status \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": rpc error: code = NotFound desc = could not find container \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": container with ID starting with 664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.709916 4948 scope.go:117] "RemoveContainer" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710175 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} err="failed to get container status \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": rpc error: code = NotFound desc = could not find container \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": container with ID starting with 73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710194 4948 scope.go:117] "RemoveContainer" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710392 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} err="failed to get container status \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": rpc error: code = NotFound desc = could not find container \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": container with ID starting with be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105 not found: ID does not 
exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710410 4948 scope.go:117] "RemoveContainer" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710620 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} err="failed to get container status \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": rpc error: code = NotFound desc = could not find container \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": container with ID starting with 25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710638 4948 scope.go:117] "RemoveContainer" containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710839 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} err="failed to get container status \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": rpc error: code = NotFound desc = could not find container \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": container with ID starting with 5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.710857 4948 scope.go:117] "RemoveContainer" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.711069 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} err="failed to get container status 
\"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": rpc error: code = NotFound desc = could not find container \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": container with ID starting with fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.711098 4948 scope.go:117] "RemoveContainer" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.711931 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} err="failed to get container status \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": rpc error: code = NotFound desc = could not find container \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": container with ID starting with fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.711962 4948 scope.go:117] "RemoveContainer" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.712230 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} err="failed to get container status \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": rpc error: code = NotFound desc = could not find container \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": container with ID starting with 9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.712251 4948 scope.go:117] "RemoveContainer" 
containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.712542 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} err="failed to get container status \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": rpc error: code = NotFound desc = could not find container \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": container with ID starting with 292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.712574 4948 scope.go:117] "RemoveContainer" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.712783 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} err="failed to get container status \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": rpc error: code = NotFound desc = could not find container \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": container with ID starting with 664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.712802 4948 scope.go:117] "RemoveContainer" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713001 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} err="failed to get container status \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": rpc error: code = NotFound desc = could 
not find container \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": container with ID starting with 73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713018 4948 scope.go:117] "RemoveContainer" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713246 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} err="failed to get container status \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": rpc error: code = NotFound desc = could not find container \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": container with ID starting with be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713273 4948 scope.go:117] "RemoveContainer" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713511 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} err="failed to get container status \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": rpc error: code = NotFound desc = could not find container \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": container with ID starting with 25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713539 4948 scope.go:117] "RemoveContainer" containerID="5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 
17:40:01.713734 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900"} err="failed to get container status \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": rpc error: code = NotFound desc = could not find container \"5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900\": container with ID starting with 5b88a42447e2dacf30f0df79ef052caef8a73ac3ecab2435508b47757fa1d900 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713752 4948 scope.go:117] "RemoveContainer" containerID="fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.713990 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa"} err="failed to get container status \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": rpc error: code = NotFound desc = could not find container \"fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa\": container with ID starting with fa61ea26743c58f1d9600dd34de67c1a57771636ce3f7fc564a7fef7d7a95eaa not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714017 4948 scope.go:117] "RemoveContainer" containerID="fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714324 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78"} err="failed to get container status \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": rpc error: code = NotFound desc = could not find container \"fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78\": container with ID starting with 
fb9be1fc74c213f73d3931a0846e6e74b1e80f32f705d6f32a6a64b2bd002b78 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714359 4948 scope.go:117] "RemoveContainer" containerID="9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714615 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61"} err="failed to get container status \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": rpc error: code = NotFound desc = could not find container \"9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61\": container with ID starting with 9618900296a1966851a63976ec7b91445f7e54e8db949ebdc193b4110995fb61 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714644 4948 scope.go:117] "RemoveContainer" containerID="292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714883 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff"} err="failed to get container status \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": rpc error: code = NotFound desc = could not find container \"292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff\": container with ID starting with 292d898dcacab2f118462ecb2a40d026fa11c16beb226df99090cc5557ad7bff not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.714912 4948 scope.go:117] "RemoveContainer" containerID="664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.715180 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911"} err="failed to get container status \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": rpc error: code = NotFound desc = could not find container \"664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911\": container with ID starting with 664c22adfe0833c352646b6844fffa6f7ab7d658fa7af8309cf66892df5ed911 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.715208 4948 scope.go:117] "RemoveContainer" containerID="73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.715473 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030"} err="failed to get container status \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": rpc error: code = NotFound desc = could not find container \"73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030\": container with ID starting with 73d3ed00e9cae315de7a0dadcd0704210df98a1b9328c0ecdf4aa10b9eb47030 not found: ID does not exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.715493 4948 scope.go:117] "RemoveContainer" containerID="be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.715738 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105"} err="failed to get container status \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": rpc error: code = NotFound desc = could not find container \"be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105\": container with ID starting with be3bb1b7da42cbdcf81fefb827e78e127895eb70afa26a57e557e6e93ac9c105 not found: ID does not 
exist" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.715759 4948 scope.go:117] "RemoveContainer" containerID="25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32" Dec 04 17:40:01 crc kubenswrapper[4948]: I1204 17:40:01.716142 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32"} err="failed to get container status \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": rpc error: code = NotFound desc = could not find container \"25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32\": container with ID starting with 25069921aa0ef7da1c7ef2197db4df45f499ed729ed329e9ddbf5566f6d3fb32 not found: ID does not exist" Dec 04 17:40:02 crc kubenswrapper[4948]: I1204 17:40:02.493274 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lz7z7_cda64a2b-9444-49d3-bee6-21e8c2bae502/kube-multus/0.log" Dec 04 17:40:02 crc kubenswrapper[4948]: I1204 17:40:02.493398 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lz7z7" event={"ID":"cda64a2b-9444-49d3-bee6-21e8c2bae502","Type":"ContainerStarted","Data":"d5b395d3ad2b9c9b343092884305e5c6d734ce16f982ea6bec8234e4f98d045a"} Dec 04 17:40:02 crc kubenswrapper[4948]: I1204 17:40:02.495074 4948 generic.go:334] "Generic (PLEG): container finished" podID="3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0" containerID="f74976f8ca5a9adc19ec60c8dab2bb2ed368928906d27c38bf8057c823c7305b" exitCode=0 Dec 04 17:40:02 crc kubenswrapper[4948]: I1204 17:40:02.495169 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerDied","Data":"f74976f8ca5a9adc19ec60c8dab2bb2ed368928906d27c38bf8057c823c7305b"} Dec 04 17:40:02 crc kubenswrapper[4948]: I1204 17:40:02.925078 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="8149892b-eb94-4d2d-99b3-cebf34efa32a" path="/var/lib/kubelet/pods/8149892b-eb94-4d2d-99b3-cebf34efa32a/volumes" Dec 04 17:40:03 crc kubenswrapper[4948]: I1204 17:40:03.505778 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"8cf665adcbcf742aa2cd64271d7619a1840dd56cc900825de8515c9ef3a2e33e"} Dec 04 17:40:03 crc kubenswrapper[4948]: I1204 17:40:03.506075 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"aac6885bf037aa05793144f916a497487738945a99c93e80c0c501295285c80d"} Dec 04 17:40:03 crc kubenswrapper[4948]: I1204 17:40:03.506085 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"6a9deea9f1e1da1114f10eb2ec718e4022baf1d778dd82a573cf8b10f280ca14"} Dec 04 17:40:03 crc kubenswrapper[4948]: I1204 17:40:03.506095 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"3258bc84493f5c13d1fe277dae3c062d9f03a542ce39892a8222873667e295fb"} Dec 04 17:40:03 crc kubenswrapper[4948]: I1204 17:40:03.506103 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"bdddf69ede75a3721df16586230d0690917caefaa483bbf634cf3c6a5519cbbf"} Dec 04 17:40:03 crc kubenswrapper[4948]: I1204 17:40:03.506111 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" 
event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"a4ab158f3d0d0a55800f80bb3ea305ae4bee934da7a933b1fbc2162aebbcce17"} Dec 04 17:40:06 crc kubenswrapper[4948]: I1204 17:40:06.533794 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"5c79843073b4f7ab812ee778b2a9336def99d3d445ff85f31396879a41cff4be"} Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.235583 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9k92p"] Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.236724 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.240126 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.240657 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.240821 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.243123 4948 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cx8lc" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.269665 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl6lt\" (UniqueName: \"kubernetes.io/projected/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-kube-api-access-bl6lt\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.269745 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-crc-storage\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.270400 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-node-mnt\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.372510 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-node-mnt\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.372671 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl6lt\" (UniqueName: \"kubernetes.io/projected/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-kube-api-access-bl6lt\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.372718 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-crc-storage\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.373441 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-node-mnt\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.374349 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-crc-storage\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.411295 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl6lt\" (UniqueName: \"kubernetes.io/projected/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-kube-api-access-bl6lt\") pod \"crc-storage-crc-9k92p\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: I1204 17:40:07.561727 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: E1204 17:40:07.600130 4948 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(b45ba8f118adb8e64ee93f849547a37dd1122a386b17b98c11f0056988b6fa2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 17:40:07 crc kubenswrapper[4948]: E1204 17:40:07.600308 4948 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(b45ba8f118adb8e64ee93f849547a37dd1122a386b17b98c11f0056988b6fa2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: E1204 17:40:07.600367 4948 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(b45ba8f118adb8e64ee93f849547a37dd1122a386b17b98c11f0056988b6fa2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:07 crc kubenswrapper[4948]: E1204 17:40:07.600444 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9k92p_crc-storage(a7f1c5ee-f673-4a6b-b6df-e7b41f310512)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9k92p_crc-storage(a7f1c5ee-f673-4a6b-b6df-e7b41f310512)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(b45ba8f118adb8e64ee93f849547a37dd1122a386b17b98c11f0056988b6fa2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9k92p" podUID="a7f1c5ee-f673-4a6b-b6df-e7b41f310512" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.549666 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" event={"ID":"3a869f3f-a59e-4e7d-a98c-4a9f941c3aa0","Type":"ContainerStarted","Data":"fdcd778ccc4a194b2bb318243ddfca169d4504f4862fbbb4ce59791a9eb03562"} Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.549956 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.550028 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.550116 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.583743 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.584137 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.584589 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" podStartSLOduration=7.584565649 podStartE2EDuration="7.584565649s" podCreationTimestamp="2025-12-04 17:40:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:40:08.577944426 +0000 UTC m=+819.939018868" watchObservedRunningTime="2025-12-04 17:40:08.584565649 +0000 UTC m=+819.945640061" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.932075 
4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9k92p"] Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.932209 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:08 crc kubenswrapper[4948]: I1204 17:40:08.932787 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:08 crc kubenswrapper[4948]: E1204 17:40:08.981573 4948 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(74620eaf4f42b92191e5ac16b57113a09e74d854e19b6ffc1c8fa5bdbd765737): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 17:40:08 crc kubenswrapper[4948]: E1204 17:40:08.981664 4948 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(74620eaf4f42b92191e5ac16b57113a09e74d854e19b6ffc1c8fa5bdbd765737): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:08 crc kubenswrapper[4948]: E1204 17:40:08.981712 4948 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(74620eaf4f42b92191e5ac16b57113a09e74d854e19b6ffc1c8fa5bdbd765737): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:08 crc kubenswrapper[4948]: E1204 17:40:08.981786 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-9k92p_crc-storage(a7f1c5ee-f673-4a6b-b6df-e7b41f310512)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-9k92p_crc-storage(a7f1c5ee-f673-4a6b-b6df-e7b41f310512)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-9k92p_crc-storage_a7f1c5ee-f673-4a6b-b6df-e7b41f310512_0(74620eaf4f42b92191e5ac16b57113a09e74d854e19b6ffc1c8fa5bdbd765737): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-9k92p" podUID="a7f1c5ee-f673-4a6b-b6df-e7b41f310512" Dec 04 17:40:10 crc kubenswrapper[4948]: I1204 17:40:10.624654 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:40:10 crc kubenswrapper[4948]: I1204 17:40:10.625165 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:40:10 crc kubenswrapper[4948]: I1204 17:40:10.625267 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:40:10 crc kubenswrapper[4948]: I1204 17:40:10.626140 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6308f9cf478332f1e14942b84d9243079dd40a48776f7e33bd8faea91d259d32"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:40:10 crc kubenswrapper[4948]: I1204 17:40:10.626259 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://6308f9cf478332f1e14942b84d9243079dd40a48776f7e33bd8faea91d259d32" gracePeriod=600 Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.067535 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8pn5t"] Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.069272 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.078520 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pn5t"] Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.138266 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wfv\" (UniqueName: \"kubernetes.io/projected/5a2d52f0-e231-461c-962e-88dbaed8a7d1-kube-api-access-t6wfv\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.138345 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-utilities\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " 
pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.138438 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-catalog-content\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.239802 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-utilities\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.239874 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wfv\" (UniqueName: \"kubernetes.io/projected/5a2d52f0-e231-461c-962e-88dbaed8a7d1-kube-api-access-t6wfv\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.239984 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-catalog-content\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.240850 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-catalog-content\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " 
pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.240836 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-utilities\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.266108 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wfv\" (UniqueName: \"kubernetes.io/projected/5a2d52f0-e231-461c-962e-88dbaed8a7d1-kube-api-access-t6wfv\") pod \"redhat-marketplace-8pn5t\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.405400 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.430966 4948 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e0fc03df0ef96d10607e80f6447bffff0be062336d2cbf71af593de665d13685): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.431096 4948 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e0fc03df0ef96d10607e80f6447bffff0be062336d2cbf71af593de665d13685): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.431133 4948 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e0fc03df0ef96d10607e80f6447bffff0be062336d2cbf71af593de665d13685): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.431204 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-8pn5t_openshift-marketplace(5a2d52f0-e231-461c-962e-88dbaed8a7d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-8pn5t_openshift-marketplace(5a2d52f0-e231-461c-962e-88dbaed8a7d1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e0fc03df0ef96d10607e80f6447bffff0be062336d2cbf71af593de665d13685): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-marketplace-8pn5t" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.567767 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: I1204 17:40:11.568440 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.600122 4948 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e4a3bfffc0b1edd702fd9e9b789cf48fb6dd4e5dff1d50f3bc271ab6e3e9a1b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.600518 4948 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e4a3bfffc0b1edd702fd9e9b789cf48fb6dd4e5dff1d50f3bc271ab6e3e9a1b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.600547 4948 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e4a3bfffc0b1edd702fd9e9b789cf48fb6dd4e5dff1d50f3bc271ab6e3e9a1b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:11 crc kubenswrapper[4948]: E1204 17:40:11.600609 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-8pn5t_openshift-marketplace(5a2d52f0-e231-461c-962e-88dbaed8a7d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-8pn5t_openshift-marketplace(5a2d52f0-e231-461c-962e-88dbaed8a7d1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-8pn5t_openshift-marketplace_5a2d52f0-e231-461c-962e-88dbaed8a7d1_0(e4a3bfffc0b1edd702fd9e9b789cf48fb6dd4e5dff1d50f3bc271ab6e3e9a1b6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/redhat-marketplace-8pn5t" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" Dec 04 17:40:12 crc kubenswrapper[4948]: I1204 17:40:12.579111 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="6308f9cf478332f1e14942b84d9243079dd40a48776f7e33bd8faea91d259d32" exitCode=0 Dec 04 17:40:12 crc kubenswrapper[4948]: I1204 17:40:12.579188 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"6308f9cf478332f1e14942b84d9243079dd40a48776f7e33bd8faea91d259d32"} Dec 04 17:40:12 crc kubenswrapper[4948]: I1204 17:40:12.579254 4948 scope.go:117] "RemoveContainer" containerID="f3c6c114e8dd7ff1e3b5d99fe1de39a49ef32c8d02345e74e4c0478fcc3c5397" Dec 04 17:40:13 crc kubenswrapper[4948]: I1204 17:40:13.586458 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"f3b577e7a9fb6d063090010b2b72a957463bccee6de5d548495cb9f10c1fd00f"} Dec 04 
17:40:19 crc kubenswrapper[4948]: I1204 17:40:19.912762 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:19 crc kubenswrapper[4948]: I1204 17:40:19.914222 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:20 crc kubenswrapper[4948]: I1204 17:40:20.196905 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9k92p"] Dec 04 17:40:20 crc kubenswrapper[4948]: W1204 17:40:20.200368 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7f1c5ee_f673_4a6b_b6df_e7b41f310512.slice/crio-56e5dcd22553d4d78e985ca494dfb37e08ca7dabad81ea01f906307dc540c9ab WatchSource:0}: Error finding container 56e5dcd22553d4d78e985ca494dfb37e08ca7dabad81ea01f906307dc540c9ab: Status 404 returned error can't find the container with id 56e5dcd22553d4d78e985ca494dfb37e08ca7dabad81ea01f906307dc540c9ab Dec 04 17:40:20 crc kubenswrapper[4948]: I1204 17:40:20.202589 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 17:40:20 crc kubenswrapper[4948]: I1204 17:40:20.628607 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9k92p" event={"ID":"a7f1c5ee-f673-4a6b-b6df-e7b41f310512","Type":"ContainerStarted","Data":"56e5dcd22553d4d78e985ca494dfb37e08ca7dabad81ea01f906307dc540c9ab"} Dec 04 17:40:22 crc kubenswrapper[4948]: I1204 17:40:22.646872 4948 generic.go:334] "Generic (PLEG): container finished" podID="a7f1c5ee-f673-4a6b-b6df-e7b41f310512" containerID="cd244c9da983f3984f926831ece779d0f49234d6fd722f26e03f34c5f69ba2dd" exitCode=0 Dec 04 17:40:22 crc kubenswrapper[4948]: I1204 17:40:22.647012 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9k92p" 
event={"ID":"a7f1c5ee-f673-4a6b-b6df-e7b41f310512","Type":"ContainerDied","Data":"cd244c9da983f3984f926831ece779d0f49234d6fd722f26e03f34c5f69ba2dd"} Dec 04 17:40:23 crc kubenswrapper[4948]: I1204 17:40:23.932101 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.033135 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl6lt\" (UniqueName: \"kubernetes.io/projected/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-kube-api-access-bl6lt\") pod \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.033379 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-crc-storage\") pod \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.033432 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-node-mnt\") pod \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\" (UID: \"a7f1c5ee-f673-4a6b-b6df-e7b41f310512\") " Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.033617 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a7f1c5ee-f673-4a6b-b6df-e7b41f310512" (UID: "a7f1c5ee-f673-4a6b-b6df-e7b41f310512"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.033848 4948 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.038645 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-kube-api-access-bl6lt" (OuterVolumeSpecName: "kube-api-access-bl6lt") pod "a7f1c5ee-f673-4a6b-b6df-e7b41f310512" (UID: "a7f1c5ee-f673-4a6b-b6df-e7b41f310512"). InnerVolumeSpecName "kube-api-access-bl6lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.062689 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a7f1c5ee-f673-4a6b-b6df-e7b41f310512" (UID: "a7f1c5ee-f673-4a6b-b6df-e7b41f310512"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.135340 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl6lt\" (UniqueName: \"kubernetes.io/projected/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-kube-api-access-bl6lt\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.135404 4948 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a7f1c5ee-f673-4a6b-b6df-e7b41f310512-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.665292 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9k92p" event={"ID":"a7f1c5ee-f673-4a6b-b6df-e7b41f310512","Type":"ContainerDied","Data":"56e5dcd22553d4d78e985ca494dfb37e08ca7dabad81ea01f906307dc540c9ab"} Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.665679 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56e5dcd22553d4d78e985ca494dfb37e08ca7dabad81ea01f906307dc540c9ab" Dec 04 17:40:24 crc kubenswrapper[4948]: I1204 17:40:24.665478 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9k92p" Dec 04 17:40:25 crc kubenswrapper[4948]: I1204 17:40:25.913250 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:25 crc kubenswrapper[4948]: I1204 17:40:25.914276 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:26 crc kubenswrapper[4948]: I1204 17:40:26.364900 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pn5t"] Dec 04 17:40:26 crc kubenswrapper[4948]: W1204 17:40:26.368499 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2d52f0_e231_461c_962e_88dbaed8a7d1.slice/crio-5084371391c3a4741d19a3cc25b0824db8b53dc8b7af344ba5422a09171d1671 WatchSource:0}: Error finding container 5084371391c3a4741d19a3cc25b0824db8b53dc8b7af344ba5422a09171d1671: Status 404 returned error can't find the container with id 5084371391c3a4741d19a3cc25b0824db8b53dc8b7af344ba5422a09171d1671 Dec 04 17:40:26 crc kubenswrapper[4948]: I1204 17:40:26.678560 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pn5t" event={"ID":"5a2d52f0-e231-461c-962e-88dbaed8a7d1","Type":"ContainerStarted","Data":"5084371391c3a4741d19a3cc25b0824db8b53dc8b7af344ba5422a09171d1671"} Dec 04 17:40:27 crc kubenswrapper[4948]: I1204 17:40:27.685686 4948 generic.go:334] "Generic (PLEG): container finished" podID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerID="bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df" exitCode=0 Dec 04 17:40:27 crc kubenswrapper[4948]: I1204 17:40:27.685750 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pn5t" event={"ID":"5a2d52f0-e231-461c-962e-88dbaed8a7d1","Type":"ContainerDied","Data":"bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df"} Dec 04 17:40:28 crc kubenswrapper[4948]: I1204 17:40:28.695472 4948 generic.go:334] "Generic (PLEG): container finished" podID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerID="086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880" exitCode=0 Dec 04 17:40:28 crc kubenswrapper[4948]: I1204 
17:40:28.695591 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pn5t" event={"ID":"5a2d52f0-e231-461c-962e-88dbaed8a7d1","Type":"ContainerDied","Data":"086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.311710 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8mwr"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.312298 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b8mwr" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="registry-server" containerID="cri-o://89108754633e4e4f5c9ff509dd230d63bc42aa9d84ef848c5b23ca0db3409b2c" gracePeriod=30 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.335074 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgkmz"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.335751 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgkmz" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="registry-server" containerID="cri-o://70ac6c7590551d7b66a364b208ee2b9c1c4aace27e5f4a20cd395be26dd58306" gracePeriod=30 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.339861 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-526nd"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.340120 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" podUID="93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" containerName="marketplace-operator" containerID="cri-o://479097f7d565e8284acd2ac5d3af8e34377b8d4d864ce9f481ddb1743bba1e75" gracePeriod=30 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.351876 
4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pn5t"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.356851 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lqjx9"] Dec 04 17:40:29 crc kubenswrapper[4948]: E1204 17:40:29.357323 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f1c5ee-f673-4a6b-b6df-e7b41f310512" containerName="storage" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.358898 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f1c5ee-f673-4a6b-b6df-e7b41f310512" containerName="storage" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.366176 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f1c5ee-f673-4a6b-b6df-e7b41f310512" containerName="storage" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.367642 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klrrz"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.368073 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-klrrz" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="registry-server" containerID="cri-o://2e95457a9abf3231097f156ef500f258026b5f0d3e7e85d93aa573585040bd91" gracePeriod=30 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.368735 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.370672 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lqjx9"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.378773 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sjh7l"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.384532 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.385297 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m67vs"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.386418 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m67vs" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="registry-server" containerID="cri-o://77191cd3e5654d54a6ecaf032bc2061e548410ddca0b9980132e7cc1b111acb0" gracePeriod=30 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.389777 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-b8mwr" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="registry-server" probeResult="failure" output="" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.399259 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjh7l"] Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.403697 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-b8mwr" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="registry-server" probeResult="failure" output="" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.405118 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-utilities\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.405232 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csp4j\" (UniqueName: \"kubernetes.io/projected/4c865fb9-4b3d-4753-8d2b-281e79ae6724-kube-api-access-csp4j\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.405311 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-catalog-content\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.506567 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-utilities\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.506617 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2tp7\" (UniqueName: \"kubernetes.io/projected/8a118304-78c9-447c-84af-57843d1f901d-kube-api-access-b2tp7\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.506650 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csp4j\" (UniqueName: \"kubernetes.io/projected/4c865fb9-4b3d-4753-8d2b-281e79ae6724-kube-api-access-csp4j\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.506674 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-catalog-content\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.506715 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a118304-78c9-447c-84af-57843d1f901d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.506749 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a118304-78c9-447c-84af-57843d1f901d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.507157 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-utilities\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.507177 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-catalog-content\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.569771 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csp4j\" (UniqueName: \"kubernetes.io/projected/4c865fb9-4b3d-4753-8d2b-281e79ae6724-kube-api-access-csp4j\") pod \"certified-operators-sjh7l\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.607820 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2tp7\" (UniqueName: \"kubernetes.io/projected/8a118304-78c9-447c-84af-57843d1f901d-kube-api-access-b2tp7\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.607879 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a118304-78c9-447c-84af-57843d1f901d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.607907 4948 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a118304-78c9-447c-84af-57843d1f901d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.610556 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a118304-78c9-447c-84af-57843d1f901d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.611184 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8a118304-78c9-447c-84af-57843d1f901d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.625894 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2tp7\" (UniqueName: \"kubernetes.io/projected/8a118304-78c9-447c-84af-57843d1f901d-kube-api-access-b2tp7\") pod \"marketplace-operator-79b997595-lqjx9\" (UID: \"8a118304-78c9-447c-84af-57843d1f901d\") " pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.690653 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.705789 4948 generic.go:334] "Generic (PLEG): container finished" podID="a4a41f24-a106-4070-8656-9344de9df965" containerID="70ac6c7590551d7b66a364b208ee2b9c1c4aace27e5f4a20cd395be26dd58306" exitCode=0 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.705866 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerDied","Data":"70ac6c7590551d7b66a364b208ee2b9c1c4aace27e5f4a20cd395be26dd58306"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.707654 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pn5t" event={"ID":"5a2d52f0-e231-461c-962e-88dbaed8a7d1","Type":"ContainerStarted","Data":"ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.710065 4948 generic.go:334] "Generic (PLEG): container finished" podID="7321007a-5f13-450f-aefe-187f2f7fccce" containerID="77191cd3e5654d54a6ecaf032bc2061e548410ddca0b9980132e7cc1b111acb0" exitCode=0 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.710121 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m67vs" event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerDied","Data":"77191cd3e5654d54a6ecaf032bc2061e548410ddca0b9980132e7cc1b111acb0"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.718329 4948 generic.go:334] "Generic (PLEG): container finished" podID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerID="2e95457a9abf3231097f156ef500f258026b5f0d3e7e85d93aa573585040bd91" exitCode=0 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.718436 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" 
event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerDied","Data":"2e95457a9abf3231097f156ef500f258026b5f0d3e7e85d93aa573585040bd91"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.721776 4948 generic.go:334] "Generic (PLEG): container finished" podID="93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" containerID="479097f7d565e8284acd2ac5d3af8e34377b8d4d864ce9f481ddb1743bba1e75" exitCode=0 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.721842 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" event={"ID":"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691","Type":"ContainerDied","Data":"479097f7d565e8284acd2ac5d3af8e34377b8d4d864ce9f481ddb1743bba1e75"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.723435 4948 generic.go:334] "Generic (PLEG): container finished" podID="1c25d318-4040-48ac-89b1-473380694ed3" containerID="89108754633e4e4f5c9ff509dd230d63bc42aa9d84ef848c5b23ca0db3409b2c" exitCode=0 Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.723471 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8mwr" event={"ID":"1c25d318-4040-48ac-89b1-473380694ed3","Type":"ContainerDied","Data":"89108754633e4e4f5c9ff509dd230d63bc42aa9d84ef848c5b23ca0db3409b2c"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.723492 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8mwr" event={"ID":"1c25d318-4040-48ac-89b1-473380694ed3","Type":"ContainerDied","Data":"e363ac507e017eb6bb42c83acc2c20d5d417bd4faf9dfecedc421717c08bd0d1"} Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.723513 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e363ac507e017eb6bb42c83acc2c20d5d417bd4faf9dfecedc421717c08bd0d1" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.737883 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-8pn5t" podStartSLOduration=17.298321388 podStartE2EDuration="18.737864622s" podCreationTimestamp="2025-12-04 17:40:11 +0000 UTC" firstStartedPulling="2025-12-04 17:40:27.687284214 +0000 UTC m=+839.048358626" lastFinishedPulling="2025-12-04 17:40:29.126827438 +0000 UTC m=+840.487901860" observedRunningTime="2025-12-04 17:40:29.735539974 +0000 UTC m=+841.096614386" watchObservedRunningTime="2025-12-04 17:40:29.737864622 +0000 UTC m=+841.098939034" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.802277 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.818615 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.842491 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.874862 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.876258 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.888527 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:40:29 crc kubenswrapper[4948]: I1204 17:40:29.947017 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lqjx9"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021171 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-utilities\") pod \"1c25d318-4040-48ac-89b1-473380694ed3\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021227 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-catalog-content\") pod \"a4a41f24-a106-4070-8656-9344de9df965\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021264 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-catalog-content\") pod \"7321007a-5f13-450f-aefe-187f2f7fccce\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021307 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-catalog-content\") pod \"1c25d318-4040-48ac-89b1-473380694ed3\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021336 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q9hb\" (UniqueName: \"kubernetes.io/projected/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-kube-api-access-2q9hb\") pod \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\" (UID: 
\"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021386 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t2bl\" (UniqueName: \"kubernetes.io/projected/1c25d318-4040-48ac-89b1-473380694ed3-kube-api-access-8t2bl\") pod \"1c25d318-4040-48ac-89b1-473380694ed3\" (UID: \"1c25d318-4040-48ac-89b1-473380694ed3\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021425 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-trusted-ca\") pod \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021471 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26bfm\" (UniqueName: \"kubernetes.io/projected/a4a41f24-a106-4070-8656-9344de9df965-kube-api-access-26bfm\") pod \"a4a41f24-a106-4070-8656-9344de9df965\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021504 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-operator-metrics\") pod \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\" (UID: \"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021523 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-catalog-content\") pod \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021547 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-slz8n\" (UniqueName: \"kubernetes.io/projected/a1ee2c3b-8e86-4667-a070-d63035fad5a8-kube-api-access-slz8n\") pod \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021595 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-utilities\") pod \"7321007a-5f13-450f-aefe-187f2f7fccce\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021627 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-utilities\") pod \"a4a41f24-a106-4070-8656-9344de9df965\" (UID: \"a4a41f24-a106-4070-8656-9344de9df965\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021658 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z948\" (UniqueName: \"kubernetes.io/projected/7321007a-5f13-450f-aefe-187f2f7fccce-kube-api-access-4z948\") pod \"7321007a-5f13-450f-aefe-187f2f7fccce\" (UID: \"7321007a-5f13-450f-aefe-187f2f7fccce\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.021694 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-utilities\") pod \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\" (UID: \"a1ee2c3b-8e86-4667-a070-d63035fad5a8\") " Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.022164 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-utilities" (OuterVolumeSpecName: "utilities") pod "1c25d318-4040-48ac-89b1-473380694ed3" (UID: 
"1c25d318-4040-48ac-89b1-473380694ed3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.023272 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" (UID: "93aadeae-ed9a-4dc5-8151-1d3b5fa9d691"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.023423 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-utilities" (OuterVolumeSpecName: "utilities") pod "7321007a-5f13-450f-aefe-187f2f7fccce" (UID: "7321007a-5f13-450f-aefe-187f2f7fccce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.024482 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-utilities" (OuterVolumeSpecName: "utilities") pod "a1ee2c3b-8e86-4667-a070-d63035fad5a8" (UID: "a1ee2c3b-8e86-4667-a070-d63035fad5a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.026644 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-utilities" (OuterVolumeSpecName: "utilities") pod "a4a41f24-a106-4070-8656-9344de9df965" (UID: "a4a41f24-a106-4070-8656-9344de9df965"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.027955 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" (UID: "93aadeae-ed9a-4dc5-8151-1d3b5fa9d691"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.028136 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ee2c3b-8e86-4667-a070-d63035fad5a8-kube-api-access-slz8n" (OuterVolumeSpecName: "kube-api-access-slz8n") pod "a1ee2c3b-8e86-4667-a070-d63035fad5a8" (UID: "a1ee2c3b-8e86-4667-a070-d63035fad5a8"). InnerVolumeSpecName "kube-api-access-slz8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.030673 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c25d318-4040-48ac-89b1-473380694ed3-kube-api-access-8t2bl" (OuterVolumeSpecName: "kube-api-access-8t2bl") pod "1c25d318-4040-48ac-89b1-473380694ed3" (UID: "1c25d318-4040-48ac-89b1-473380694ed3"). InnerVolumeSpecName "kube-api-access-8t2bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.030944 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-kube-api-access-2q9hb" (OuterVolumeSpecName: "kube-api-access-2q9hb") pod "93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" (UID: "93aadeae-ed9a-4dc5-8151-1d3b5fa9d691"). InnerVolumeSpecName "kube-api-access-2q9hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.030994 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7321007a-5f13-450f-aefe-187f2f7fccce-kube-api-access-4z948" (OuterVolumeSpecName: "kube-api-access-4z948") pod "7321007a-5f13-450f-aefe-187f2f7fccce" (UID: "7321007a-5f13-450f-aefe-187f2f7fccce"). InnerVolumeSpecName "kube-api-access-4z948". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.031141 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a41f24-a106-4070-8656-9344de9df965-kube-api-access-26bfm" (OuterVolumeSpecName: "kube-api-access-26bfm") pod "a4a41f24-a106-4070-8656-9344de9df965" (UID: "a4a41f24-a106-4070-8656-9344de9df965"). InnerVolumeSpecName "kube-api-access-26bfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.035704 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sjh7l"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.055408 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1ee2c3b-8e86-4667-a070-d63035fad5a8" (UID: "a1ee2c3b-8e86-4667-a070-d63035fad5a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.100815 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c25d318-4040-48ac-89b1-473380694ed3" (UID: "1c25d318-4040-48ac-89b1-473380694ed3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.105176 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4a41f24-a106-4070-8656-9344de9df965" (UID: "a4a41f24-a106-4070-8656-9344de9df965"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123102 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123148 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123159 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z948\" (UniqueName: \"kubernetes.io/projected/7321007a-5f13-450f-aefe-187f2f7fccce-kube-api-access-4z948\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123169 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123177 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123185 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a4a41f24-a106-4070-8656-9344de9df965-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123193 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c25d318-4040-48ac-89b1-473380694ed3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123201 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q9hb\" (UniqueName: \"kubernetes.io/projected/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-kube-api-access-2q9hb\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123209 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t2bl\" (UniqueName: \"kubernetes.io/projected/1c25d318-4040-48ac-89b1-473380694ed3-kube-api-access-8t2bl\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123218 4948 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123238 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26bfm\" (UniqueName: \"kubernetes.io/projected/a4a41f24-a106-4070-8656-9344de9df965-kube-api-access-26bfm\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123246 4948 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123255 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a1ee2c3b-8e86-4667-a070-d63035fad5a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.123265 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slz8n\" (UniqueName: \"kubernetes.io/projected/a1ee2c3b-8e86-4667-a070-d63035fad5a8-kube-api-access-slz8n\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.184759 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7321007a-5f13-450f-aefe-187f2f7fccce" (UID: "7321007a-5f13-450f-aefe-187f2f7fccce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.226219 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7321007a-5f13-450f-aefe-187f2f7fccce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.734684 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-klrrz" event={"ID":"a1ee2c3b-8e86-4667-a070-d63035fad5a8","Type":"ContainerDied","Data":"022d3cab728ec42696bc71e8c06ea4c3df064bc6bf7f8d33570df143f9c92125"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.735134 4948 scope.go:117] "RemoveContainer" containerID="2e95457a9abf3231097f156ef500f258026b5f0d3e7e85d93aa573585040bd91" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.734696 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-klrrz" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736119 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-28b8g"] Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736427 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736451 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736479 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736499 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736522 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736540 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736558 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736576 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736608 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736626 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736651 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736669 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736699 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736716 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736747 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736764 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736789 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" containerName="marketplace-operator" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736806 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" containerName="marketplace-operator" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736828 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736845 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736867 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736883 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736905 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736922 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="extract-utilities" Dec 04 17:40:30 crc kubenswrapper[4948]: E1204 17:40:30.736940 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.736957 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="extract-content" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.737964 4948 generic.go:334] "Generic (PLEG): container finished" podID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerID="ef6dde1a0289a1c18889c46323cfa3e504a4740273744c24daa92085a7af366a" exitCode=0 Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.739581 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.739625 4948 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a41f24-a106-4070-8656-9344de9df965" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.739644 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.739666 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" containerName="marketplace-operator" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.739692 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c25d318-4040-48ac-89b1-473380694ed3" containerName="registry-server" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.740972 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerDied","Data":"ef6dde1a0289a1c18889c46323cfa3e504a4740273744c24daa92085a7af366a"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.741081 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerStarted","Data":"8df1ffcfaa966d7cdfbc74fb47b1cd8378d57f6893682acd27e7a6e5ce8cdfa1"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.741234 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.745613 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgkmz" event={"ID":"a4a41f24-a106-4070-8656-9344de9df965","Type":"ContainerDied","Data":"7e6a2753f8859b49546a026df9225ac7a141aa64794794a45efe0157079cf1f6"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.745763 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgkmz" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.749514 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" event={"ID":"8a118304-78c9-447c-84af-57843d1f901d","Type":"ContainerStarted","Data":"6e92011dc662730d133087a6e7da9663ee07bec566a7948ffb25e7cf303ef851"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.749598 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" event={"ID":"8a118304-78c9-447c-84af-57843d1f901d","Type":"ContainerStarted","Data":"c60e2a23026b5748037718a1088fa64a37ad4fe77f76889838639adfc9ea8d5d"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.749761 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.755857 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.758121 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28b8g"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.761289 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-m67vs" event={"ID":"7321007a-5f13-450f-aefe-187f2f7fccce","Type":"ContainerDied","Data":"06803ac3d665769db648dcdb9790d1b310a0165fc6379fb14605483f165582a5"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.761407 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m67vs" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.767743 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" event={"ID":"93aadeae-ed9a-4dc5-8151-1d3b5fa9d691","Type":"ContainerDied","Data":"1b803e9cd11212a2b3de968464f151d186a9a9b3853911b71086ad1111101806"} Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.767814 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8mwr" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.767830 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-526nd" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.769815 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8pn5t" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="registry-server" containerID="cri-o://ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f" gracePeriod=30 Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.775707 4948 scope.go:117] "RemoveContainer" containerID="8b5540ebde591cf265a86997f3e970c8379511ebaae01757bb6cc8a882ed42e9" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.783284 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lqjx9" podStartSLOduration=1.783263364 podStartE2EDuration="1.783263364s" podCreationTimestamp="2025-12-04 17:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:40:30.778191026 +0000 UTC m=+842.139265438" watchObservedRunningTime="2025-12-04 17:40:30.783263364 +0000 UTC m=+842.144337766" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.797077 4948 scope.go:117] "RemoveContainer" containerID="415d35e6292c2f70d118361b3f2350809c3b22d3d98edf56282948d679494bd0" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.810891 4948 scope.go:117] "RemoveContainer" containerID="70ac6c7590551d7b66a364b208ee2b9c1c4aace27e5f4a20cd395be26dd58306" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.836702 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8mwr"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.836746 4948 scope.go:117] "RemoveContainer" containerID="d50cfd819f02966d3447e2cc88676ba40a4a48b9bd42e768d555e2c50a3ac8f7" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 
17:40:30.840483 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b8mwr"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.852677 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-klrrz"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.855885 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-klrrz"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.877664 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m67vs"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.879906 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m67vs"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.888140 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgkmz"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.900560 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgkmz"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.908249 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-526nd"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.911179 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-526nd"] Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.911484 4948 scope.go:117] "RemoveContainer" containerID="398d0f69f53254f3b3e634764278555a9c979d8b0a1c0b01e468ee8a7f2b5fa7" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.920140 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c25d318-4040-48ac-89b1-473380694ed3" path="/var/lib/kubelet/pods/1c25d318-4040-48ac-89b1-473380694ed3/volumes" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 
17:40:30.920813 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7321007a-5f13-450f-aefe-187f2f7fccce" path="/var/lib/kubelet/pods/7321007a-5f13-450f-aefe-187f2f7fccce/volumes" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.921525 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93aadeae-ed9a-4dc5-8151-1d3b5fa9d691" path="/var/lib/kubelet/pods/93aadeae-ed9a-4dc5-8151-1d3b5fa9d691/volumes" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.922365 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ee2c3b-8e86-4667-a070-d63035fad5a8" path="/var/lib/kubelet/pods/a1ee2c3b-8e86-4667-a070-d63035fad5a8/volumes" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.922988 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a41f24-a106-4070-8656-9344de9df965" path="/var/lib/kubelet/pods/a4a41f24-a106-4070-8656-9344de9df965/volumes" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.933298 4948 scope.go:117] "RemoveContainer" containerID="77191cd3e5654d54a6ecaf032bc2061e548410ddca0b9980132e7cc1b111acb0" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.937288 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-utilities\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.937342 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-catalog-content\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 
17:40:30.937417 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4s8f\" (UniqueName: \"kubernetes.io/projected/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-kube-api-access-q4s8f\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.950845 4948 scope.go:117] "RemoveContainer" containerID="24fbf9343f0215f56669f02b70ae4cd0e19d904521f9bf3bfa07b0f0effb1707" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.968887 4948 scope.go:117] "RemoveContainer" containerID="b0321610bcb85b6233ed60b8f6775010af8d0801ffd5d7db493820f3b3ec7024" Dec 04 17:40:30 crc kubenswrapper[4948]: I1204 17:40:30.984765 4948 scope.go:117] "RemoveContainer" containerID="479097f7d565e8284acd2ac5d3af8e34377b8d4d864ce9f481ddb1743bba1e75" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.039229 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-utilities\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.039274 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-catalog-content\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.039380 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4s8f\" (UniqueName: \"kubernetes.io/projected/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-kube-api-access-q4s8f\") pod 
\"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.039691 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-catalog-content\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.039793 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-utilities\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.055605 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.058539 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4s8f\" (UniqueName: \"kubernetes.io/projected/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-kube-api-access-q4s8f\") pod \"community-operators-28b8g\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.071893 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.242478 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-catalog-content\") pod \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.242786 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6wfv\" (UniqueName: \"kubernetes.io/projected/5a2d52f0-e231-461c-962e-88dbaed8a7d1-kube-api-access-t6wfv\") pod \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.242823 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-utilities\") pod \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\" (UID: \"5a2d52f0-e231-461c-962e-88dbaed8a7d1\") " Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.243716 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-utilities" (OuterVolumeSpecName: "utilities") pod "5a2d52f0-e231-461c-962e-88dbaed8a7d1" (UID: "5a2d52f0-e231-461c-962e-88dbaed8a7d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.247651 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2d52f0-e231-461c-962e-88dbaed8a7d1-kube-api-access-t6wfv" (OuterVolumeSpecName: "kube-api-access-t6wfv") pod "5a2d52f0-e231-461c-962e-88dbaed8a7d1" (UID: "5a2d52f0-e231-461c-962e-88dbaed8a7d1"). InnerVolumeSpecName "kube-api-access-t6wfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.264543 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a2d52f0-e231-461c-962e-88dbaed8a7d1" (UID: "5a2d52f0-e231-461c-962e-88dbaed8a7d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.282063 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28b8g"] Dec 04 17:40:31 crc kubenswrapper[4948]: W1204 17:40:31.287889 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0c7d4cb_67a6_40b7_8a90_a1bb14032982.slice/crio-df294c7b0ed94a96706b0e3900f3e110d1bbbafa4c7aa523bb70853d613e45ec WatchSource:0}: Error finding container df294c7b0ed94a96706b0e3900f3e110d1bbbafa4c7aa523bb70853d613e45ec: Status 404 returned error can't find the container with id df294c7b0ed94a96706b0e3900f3e110d1bbbafa4c7aa523bb70853d613e45ec Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.343443 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.343470 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6wfv\" (UniqueName: \"kubernetes.io/projected/5a2d52f0-e231-461c-962e-88dbaed8a7d1-kube-api-access-t6wfv\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.343481 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a2d52f0-e231-461c-962e-88dbaed8a7d1-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.454305 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7b4ps" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.731721 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ps6x2"] Dec 04 17:40:31 crc kubenswrapper[4948]: E1204 17:40:31.731944 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="extract-content" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.731959 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="extract-content" Dec 04 17:40:31 crc kubenswrapper[4948]: E1204 17:40:31.731972 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="extract-utilities" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.731978 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="extract-utilities" Dec 04 17:40:31 crc kubenswrapper[4948]: E1204 17:40:31.731986 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="registry-server" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.731992 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="registry-server" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.732150 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerName="registry-server" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.732870 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.748974 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73953573-1a1a-49b7-a686-2056cf6e6937-utilities\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.749013 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr27b\" (UniqueName: \"kubernetes.io/projected/73953573-1a1a-49b7-a686-2056cf6e6937-kube-api-access-qr27b\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.749163 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73953573-1a1a-49b7-a686-2056cf6e6937-catalog-content\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.750380 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps6x2"] Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.813491 4948 generic.go:334] "Generic (PLEG): container finished" podID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" containerID="ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f" exitCode=0 Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.813549 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pn5t" 
event={"ID":"5a2d52f0-e231-461c-962e-88dbaed8a7d1","Type":"ContainerDied","Data":"ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f"} Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.813603 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pn5t" event={"ID":"5a2d52f0-e231-461c-962e-88dbaed8a7d1","Type":"ContainerDied","Data":"5084371391c3a4741d19a3cc25b0824db8b53dc8b7af344ba5422a09171d1671"} Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.813617 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pn5t" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.813631 4948 scope.go:117] "RemoveContainer" containerID="ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.816452 4948 generic.go:334] "Generic (PLEG): container finished" podID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerID="4779a63729612c7f59d4a4ea01ceaaf082897c9bc9797bc6808ecb84761685d4" exitCode=0 Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.816566 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerDied","Data":"4779a63729612c7f59d4a4ea01ceaaf082897c9bc9797bc6808ecb84761685d4"} Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.816609 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerStarted","Data":"df294c7b0ed94a96706b0e3900f3e110d1bbbafa4c7aa523bb70853d613e45ec"} Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.830658 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" 
event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerStarted","Data":"5aa1eec057aa8ed7bfb36de7b1d8104d2a5eb72226a8457a15e13b29ea3a93e7"} Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.849825 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73953573-1a1a-49b7-a686-2056cf6e6937-catalog-content\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.849885 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73953573-1a1a-49b7-a686-2056cf6e6937-utilities\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.849914 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr27b\" (UniqueName: \"kubernetes.io/projected/73953573-1a1a-49b7-a686-2056cf6e6937-kube-api-access-qr27b\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.850613 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73953573-1a1a-49b7-a686-2056cf6e6937-catalog-content\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.850693 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73953573-1a1a-49b7-a686-2056cf6e6937-utilities\") pod \"redhat-marketplace-ps6x2\" 
(UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.875448 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr27b\" (UniqueName: \"kubernetes.io/projected/73953573-1a1a-49b7-a686-2056cf6e6937-kube-api-access-qr27b\") pod \"redhat-marketplace-ps6x2\" (UID: \"73953573-1a1a-49b7-a686-2056cf6e6937\") " pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.876908 4948 scope.go:117] "RemoveContainer" containerID="086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.888301 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pn5t"] Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.893097 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pn5t"] Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.901742 4948 scope.go:117] "RemoveContainer" containerID="bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.919313 4948 scope.go:117] "RemoveContainer" containerID="ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f" Dec 04 17:40:31 crc kubenswrapper[4948]: E1204 17:40:31.919640 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f\": container with ID starting with ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f not found: ID does not exist" containerID="ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.919669 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f"} err="failed to get container status \"ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f\": rpc error: code = NotFound desc = could not find container \"ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f\": container with ID starting with ff591f003dbd7e562a590ab9a0170ae8ab91af2b21bf3f0e9ac8a1d6c49a1c8f not found: ID does not exist" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.919691 4948 scope.go:117] "RemoveContainer" containerID="086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880" Dec 04 17:40:31 crc kubenswrapper[4948]: E1204 17:40:31.920147 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880\": container with ID starting with 086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880 not found: ID does not exist" containerID="086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.920170 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880"} err="failed to get container status \"086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880\": rpc error: code = NotFound desc = could not find container \"086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880\": container with ID starting with 086b8b6b9fbcd7b343594cb943bd754c49efd039d96c5e9e5d8d1f0a4c6f7880 not found: ID does not exist" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.920185 4948 scope.go:117] "RemoveContainer" containerID="bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df" Dec 04 17:40:31 crc kubenswrapper[4948]: E1204 17:40:31.920441 4948 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df\": container with ID starting with bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df not found: ID does not exist" containerID="bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df" Dec 04 17:40:31 crc kubenswrapper[4948]: I1204 17:40:31.920490 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df"} err="failed to get container status \"bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df\": rpc error: code = NotFound desc = could not find container \"bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df\": container with ID starting with bbe89bd54eb0b618ec57e13eafdf20b07bfb799b831b8729633601d2afa8b3df not found: ID does not exist" Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.050838 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.307941 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ps6x2"] Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.845593 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerStarted","Data":"ec9d0b1996e4aa35d813cb7c65759cdbd59ff9c714daab92261496aebdb9c91e"} Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.848812 4948 generic.go:334] "Generic (PLEG): container finished" podID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerID="5aa1eec057aa8ed7bfb36de7b1d8104d2a5eb72226a8457a15e13b29ea3a93e7" exitCode=0 Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.848910 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerDied","Data":"5aa1eec057aa8ed7bfb36de7b1d8104d2a5eb72226a8457a15e13b29ea3a93e7"} Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.848946 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerStarted","Data":"cb372894875eef5f9c6fe1924c596c667ae07e145eec843617d105997120a5dc"} Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.851405 4948 generic.go:334] "Generic (PLEG): container finished" podID="73953573-1a1a-49b7-a686-2056cf6e6937" containerID="d88b3840d5aa3b76d4860d49902d9cb8f7ac51c13d19dc9ebaf03e01699071d2" exitCode=0 Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.851443 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps6x2" 
event={"ID":"73953573-1a1a-49b7-a686-2056cf6e6937","Type":"ContainerDied","Data":"d88b3840d5aa3b76d4860d49902d9cb8f7ac51c13d19dc9ebaf03e01699071d2"} Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.851469 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps6x2" event={"ID":"73953573-1a1a-49b7-a686-2056cf6e6937","Type":"ContainerStarted","Data":"7ea7054a9cac04afbee022c1eb4532dd5b17c1146caf2710924bacaf20b17d3c"} Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.884494 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sjh7l" podStartSLOduration=2.373621648 podStartE2EDuration="3.884472777s" podCreationTimestamp="2025-12-04 17:40:29 +0000 UTC" firstStartedPulling="2025-12-04 17:40:30.741724679 +0000 UTC m=+842.102799091" lastFinishedPulling="2025-12-04 17:40:32.252575818 +0000 UTC m=+843.613650220" observedRunningTime="2025-12-04 17:40:32.882637533 +0000 UTC m=+844.243711955" watchObservedRunningTime="2025-12-04 17:40:32.884472777 +0000 UTC m=+844.245547199" Dec 04 17:40:32 crc kubenswrapper[4948]: I1204 17:40:32.921918 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2d52f0-e231-461c-962e-88dbaed8a7d1" path="/var/lib/kubelet/pods/5a2d52f0-e231-461c-962e-88dbaed8a7d1/volumes" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.141431 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mvf9l"] Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.143777 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.163417 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvf9l"] Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.164313 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stjmg\" (UniqueName: \"kubernetes.io/projected/336ada24-a6eb-405e-ac32-f04009852896-kube-api-access-stjmg\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.164385 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-catalog-content\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.164439 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-utilities\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.265813 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-catalog-content\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.266101 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-utilities\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.266323 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stjmg\" (UniqueName: \"kubernetes.io/projected/336ada24-a6eb-405e-ac32-f04009852896-kube-api-access-stjmg\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.266464 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-catalog-content\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.266480 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-utilities\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.295785 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stjmg\" (UniqueName: \"kubernetes.io/projected/336ada24-a6eb-405e-ac32-f04009852896-kube-api-access-stjmg\") pod \"certified-operators-mvf9l\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.468349 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.860633 4948 generic.go:334] "Generic (PLEG): container finished" podID="73953573-1a1a-49b7-a686-2056cf6e6937" containerID="abc5fe8897837c2fa2073a69042b19c9ae0da03f3609df9023f189c03bb621a5" exitCode=0 Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.860683 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps6x2" event={"ID":"73953573-1a1a-49b7-a686-2056cf6e6937","Type":"ContainerDied","Data":"abc5fe8897837c2fa2073a69042b19c9ae0da03f3609df9023f189c03bb621a5"} Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.864037 4948 generic.go:334] "Generic (PLEG): container finished" podID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerID="ec9d0b1996e4aa35d813cb7c65759cdbd59ff9c714daab92261496aebdb9c91e" exitCode=0 Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.864135 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerDied","Data":"ec9d0b1996e4aa35d813cb7c65759cdbd59ff9c714daab92261496aebdb9c91e"} Dec 04 17:40:33 crc kubenswrapper[4948]: I1204 17:40:33.968418 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mvf9l"] Dec 04 17:40:33 crc kubenswrapper[4948]: W1204 17:40:33.980947 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod336ada24_a6eb_405e_ac32_f04009852896.slice/crio-3442db8d50667a3aa23acd0a1925e43754acfb2fad24f2144b9ef37d80de6140 WatchSource:0}: Error finding container 3442db8d50667a3aa23acd0a1925e43754acfb2fad24f2144b9ef37d80de6140: Status 404 returned error can't find the container with id 3442db8d50667a3aa23acd0a1925e43754acfb2fad24f2144b9ef37d80de6140 Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 
17:40:34.133849 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9frz6"] Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.135053 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.140545 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.144424 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9frz6"] Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.278122 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbndg\" (UniqueName: \"kubernetes.io/projected/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-kube-api-access-jbndg\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.278218 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-catalog-content\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.278255 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-utilities\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.379264 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jbndg\" (UniqueName: \"kubernetes.io/projected/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-kube-api-access-jbndg\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.379339 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-catalog-content\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.379363 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-utilities\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.379862 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-utilities\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.379919 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-catalog-content\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.403987 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbndg\" 
(UniqueName: \"kubernetes.io/projected/b6a62a2c-ea79-4cc5-ac96-0b485cda907c-kube-api-access-jbndg\") pod \"redhat-operators-9frz6\" (UID: \"b6a62a2c-ea79-4cc5-ac96-0b485cda907c\") " pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.549133 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grq44"] Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.549480 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.551483 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.558646 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grq44"] Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.682859 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-utilities\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.682918 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s2fd\" (UniqueName: \"kubernetes.io/projected/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-kube-api-access-9s2fd\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.683007 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-catalog-content\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.783901 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-utilities\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.784251 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2fd\" (UniqueName: \"kubernetes.io/projected/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-kube-api-access-9s2fd\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.784352 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-catalog-content\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.784721 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-catalog-content\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.784723 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-utilities\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.806401 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s2fd\" (UniqueName: \"kubernetes.io/projected/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-kube-api-access-9s2fd\") pod \"redhat-operators-grq44\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.872304 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ps6x2" event={"ID":"73953573-1a1a-49b7-a686-2056cf6e6937","Type":"ContainerStarted","Data":"a80640b822825bfcd5a56b3cc22f95ad87d8852da6b3e4078c9b39aa520914e9"} Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.875710 4948 generic.go:334] "Generic (PLEG): container finished" podID="336ada24-a6eb-405e-ac32-f04009852896" containerID="82bd6d5966c0c1081c9e1c0191f9efbb579e64ad9b3814df1220900b64c1b34a" exitCode=0 Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.875765 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerDied","Data":"82bd6d5966c0c1081c9e1c0191f9efbb579e64ad9b3814df1220900b64c1b34a"} Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.875797 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerStarted","Data":"3442db8d50667a3aa23acd0a1925e43754acfb2fad24f2144b9ef37d80de6140"} Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.894496 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-ps6x2" podStartSLOduration=2.233284114 podStartE2EDuration="3.894473538s" podCreationTimestamp="2025-12-04 17:40:31 +0000 UTC" firstStartedPulling="2025-12-04 17:40:32.853960127 +0000 UTC m=+844.215034529" lastFinishedPulling="2025-12-04 17:40:34.515149521 +0000 UTC m=+845.876223953" observedRunningTime="2025-12-04 17:40:34.891372877 +0000 UTC m=+846.252447289" watchObservedRunningTime="2025-12-04 17:40:34.894473538 +0000 UTC m=+846.255547950" Dec 04 17:40:34 crc kubenswrapper[4948]: I1204 17:40:34.956243 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.001326 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9frz6"] Dec 04 17:40:35 crc kubenswrapper[4948]: W1204 17:40:35.012586 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6a62a2c_ea79_4cc5_ac96_0b485cda907c.slice/crio-2522a08d6dcef84398dd27551f6d24b0425e37cd6c9131bc8fddd8d089aa3f31 WatchSource:0}: Error finding container 2522a08d6dcef84398dd27551f6d24b0425e37cd6c9131bc8fddd8d089aa3f31: Status 404 returned error can't find the container with id 2522a08d6dcef84398dd27551f6d24b0425e37cd6c9131bc8fddd8d089aa3f31 Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.185615 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grq44"] Dec 04 17:40:35 crc kubenswrapper[4948]: W1204 17:40:35.235734 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3065ebe_0ae6_4e2a_bb80_6877c44e3e7c.slice/crio-89d038d162cf3e90343a60b878c17adf91f2f4900b26e3a5d971073af6cf3608 WatchSource:0}: Error finding container 89d038d162cf3e90343a60b878c17adf91f2f4900b26e3a5d971073af6cf3608: Status 404 returned 
error can't find the container with id 89d038d162cf3e90343a60b878c17adf91f2f4900b26e3a5d971073af6cf3608 Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.529068 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k2vf9"] Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.529961 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.540053 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2vf9"] Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.697101 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxxgf\" (UniqueName: \"kubernetes.io/projected/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-kube-api-access-jxxgf\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.697157 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-utilities\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.697192 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-catalog-content\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.798545 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jxxgf\" (UniqueName: \"kubernetes.io/projected/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-kube-api-access-jxxgf\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.798626 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-utilities\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.798663 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-catalog-content\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.799239 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-catalog-content\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.799550 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-utilities\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.832701 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jxxgf\" (UniqueName: \"kubernetes.io/projected/0293d2b5-dfcf-4589-8464-8f4f1616bd5d-kube-api-access-jxxgf\") pod \"community-operators-k2vf9\" (UID: \"0293d2b5-dfcf-4589-8464-8f4f1616bd5d\") " pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.846127 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.887322 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerStarted","Data":"7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e"} Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.889596 4948 generic.go:334] "Generic (PLEG): container finished" podID="b6a62a2c-ea79-4cc5-ac96-0b485cda907c" containerID="9f1139834e8e4776ef9db58d5ce67a94d78b9a519bfa03938eeba758d5c7e5ad" exitCode=0 Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.889650 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9frz6" event={"ID":"b6a62a2c-ea79-4cc5-ac96-0b485cda907c","Type":"ContainerDied","Data":"9f1139834e8e4776ef9db58d5ce67a94d78b9a519bfa03938eeba758d5c7e5ad"} Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.889702 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9frz6" event={"ID":"b6a62a2c-ea79-4cc5-ac96-0b485cda907c","Type":"ContainerStarted","Data":"2522a08d6dcef84398dd27551f6d24b0425e37cd6c9131bc8fddd8d089aa3f31"} Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.892461 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" 
event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerStarted","Data":"1c4845ecaa986618e879bf5def3f055aed9f843006b36b7e6703ef5863133124"} Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.894267 4948 generic.go:334] "Generic (PLEG): container finished" podID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerID="082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575" exitCode=0 Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.895266 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grq44" event={"ID":"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c","Type":"ContainerDied","Data":"082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575"} Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.895286 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grq44" event={"ID":"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c","Type":"ContainerStarted","Data":"89d038d162cf3e90343a60b878c17adf91f2f4900b26e3a5d971073af6cf3608"} Dec 04 17:40:35 crc kubenswrapper[4948]: I1204 17:40:35.914615 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-28b8g" podStartSLOduration=3.010884615 podStartE2EDuration="5.914597354s" podCreationTimestamp="2025-12-04 17:40:30 +0000 UTC" firstStartedPulling="2025-12-04 17:40:31.818362394 +0000 UTC m=+843.179436806" lastFinishedPulling="2025-12-04 17:40:34.722075133 +0000 UTC m=+846.083149545" observedRunningTime="2025-12-04 17:40:35.91346068 +0000 UTC m=+847.274535082" watchObservedRunningTime="2025-12-04 17:40:35.914597354 +0000 UTC m=+847.275671756" Dec 04 17:40:36 crc kubenswrapper[4948]: W1204 17:40:36.268819 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0293d2b5_dfcf_4589_8464_8f4f1616bd5d.slice/crio-9a07ffdc013c5784df864549415339dc9f0b2ec76ec427c302c41e6048907abf 
WatchSource:0}: Error finding container 9a07ffdc013c5784df864549415339dc9f0b2ec76ec427c302c41e6048907abf: Status 404 returned error can't find the container with id 9a07ffdc013c5784df864549415339dc9f0b2ec76ec427c302c41e6048907abf Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.282481 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2vf9"] Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.904005 4948 generic.go:334] "Generic (PLEG): container finished" podID="336ada24-a6eb-405e-ac32-f04009852896" containerID="1c4845ecaa986618e879bf5def3f055aed9f843006b36b7e6703ef5863133124" exitCode=0 Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.904140 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerDied","Data":"1c4845ecaa986618e879bf5def3f055aed9f843006b36b7e6703ef5863133124"} Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.905920 4948 generic.go:334] "Generic (PLEG): container finished" podID="0293d2b5-dfcf-4589-8464-8f4f1616bd5d" containerID="03f8ec49bf1bca93962f45bb90db8789e326a0b3c8f218db90ef51064bef2256" exitCode=0 Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.905998 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2vf9" event={"ID":"0293d2b5-dfcf-4589-8464-8f4f1616bd5d","Type":"ContainerDied","Data":"03f8ec49bf1bca93962f45bb90db8789e326a0b3c8f218db90ef51064bef2256"} Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.906037 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2vf9" event={"ID":"0293d2b5-dfcf-4589-8464-8f4f1616bd5d","Type":"ContainerStarted","Data":"9a07ffdc013c5784df864549415339dc9f0b2ec76ec427c302c41e6048907abf"} Dec 04 17:40:36 crc kubenswrapper[4948]: I1204 17:40:36.911750 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9frz6" event={"ID":"b6a62a2c-ea79-4cc5-ac96-0b485cda907c","Type":"ContainerStarted","Data":"a882a0640c748005c3d344c67a1c8aa41fa1f109d4145d2bebf0267e0df6ad41"} Dec 04 17:40:37 crc kubenswrapper[4948]: I1204 17:40:37.918223 4948 generic.go:334] "Generic (PLEG): container finished" podID="b6a62a2c-ea79-4cc5-ac96-0b485cda907c" containerID="a882a0640c748005c3d344c67a1c8aa41fa1f109d4145d2bebf0267e0df6ad41" exitCode=0 Dec 04 17:40:37 crc kubenswrapper[4948]: I1204 17:40:37.918332 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9frz6" event={"ID":"b6a62a2c-ea79-4cc5-ac96-0b485cda907c","Type":"ContainerDied","Data":"a882a0640c748005c3d344c67a1c8aa41fa1f109d4145d2bebf0267e0df6ad41"} Dec 04 17:40:37 crc kubenswrapper[4948]: I1204 17:40:37.921631 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerStarted","Data":"683f0c8dd6336e2972c4c9e6013948a40aaf02b4f7702c622d325d2d7f112807"} Dec 04 17:40:37 crc kubenswrapper[4948]: I1204 17:40:37.923590 4948 generic.go:334] "Generic (PLEG): container finished" podID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerID="73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308" exitCode=0 Dec 04 17:40:37 crc kubenswrapper[4948]: I1204 17:40:37.923638 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grq44" event={"ID":"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c","Type":"ContainerDied","Data":"73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308"} Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.000097 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mvf9l" podStartSLOduration=2.33867761 podStartE2EDuration="5.000071231s" podCreationTimestamp="2025-12-04 17:40:33 +0000 UTC" 
firstStartedPulling="2025-12-04 17:40:34.877475137 +0000 UTC m=+846.238549549" lastFinishedPulling="2025-12-04 17:40:37.538868738 +0000 UTC m=+848.899943170" observedRunningTime="2025-12-04 17:40:37.997301589 +0000 UTC m=+849.358376001" watchObservedRunningTime="2025-12-04 17:40:38.000071231 +0000 UTC m=+849.361145663" Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.930473 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grq44" event={"ID":"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c","Type":"ContainerStarted","Data":"8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129"} Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.932105 4948 generic.go:334] "Generic (PLEG): container finished" podID="0293d2b5-dfcf-4589-8464-8f4f1616bd5d" containerID="a38a04f8b3a3afa73dc28dddd63e5f10a596383773248ac818b5470d6408f870" exitCode=0 Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.932146 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2vf9" event={"ID":"0293d2b5-dfcf-4589-8464-8f4f1616bd5d","Type":"ContainerDied","Data":"a38a04f8b3a3afa73dc28dddd63e5f10a596383773248ac818b5470d6408f870"} Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.934387 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9frz6" event={"ID":"b6a62a2c-ea79-4cc5-ac96-0b485cda907c","Type":"ContainerStarted","Data":"b0bd8e3fb83a9a2e66d0afb8313067c46f72e73ac233fc6add6e77866cf4c844"} Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.953105 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grq44" podStartSLOduration=2.481141842 podStartE2EDuration="4.953087158s" podCreationTimestamp="2025-12-04 17:40:34 +0000 UTC" firstStartedPulling="2025-12-04 17:40:35.895972544 +0000 UTC m=+847.257046946" lastFinishedPulling="2025-12-04 17:40:38.36791785 +0000 UTC 
m=+849.728992262" observedRunningTime="2025-12-04 17:40:38.949640466 +0000 UTC m=+850.310714888" watchObservedRunningTime="2025-12-04 17:40:38.953087158 +0000 UTC m=+850.314161570" Dec 04 17:40:38 crc kubenswrapper[4948]: I1204 17:40:38.966395 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9frz6" podStartSLOduration=2.523379587 podStartE2EDuration="4.966372319s" podCreationTimestamp="2025-12-04 17:40:34 +0000 UTC" firstStartedPulling="2025-12-04 17:40:35.890837973 +0000 UTC m=+847.251912415" lastFinishedPulling="2025-12-04 17:40:38.333830735 +0000 UTC m=+849.694905147" observedRunningTime="2025-12-04 17:40:38.966130982 +0000 UTC m=+850.327205404" watchObservedRunningTime="2025-12-04 17:40:38.966372319 +0000 UTC m=+850.327446731" Dec 04 17:40:39 crc kubenswrapper[4948]: I1204 17:40:39.802795 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:39 crc kubenswrapper[4948]: I1204 17:40:39.802848 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:39 crc kubenswrapper[4948]: I1204 17:40:39.868323 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:39 crc kubenswrapper[4948]: I1204 17:40:39.948958 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2vf9" event={"ID":"0293d2b5-dfcf-4589-8464-8f4f1616bd5d","Type":"ContainerStarted","Data":"7d30cc72af622bf084b3b866caeb22b697572f3ff4d899d06ab357eac5a546fe"} Dec 04 17:40:39 crc kubenswrapper[4948]: I1204 17:40:39.972767 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k2vf9" podStartSLOduration=2.21916639 podStartE2EDuration="4.97274098s" podCreationTimestamp="2025-12-04 17:40:35 
+0000 UTC" firstStartedPulling="2025-12-04 17:40:36.90825221 +0000 UTC m=+848.269326632" lastFinishedPulling="2025-12-04 17:40:39.66182682 +0000 UTC m=+851.022901222" observedRunningTime="2025-12-04 17:40:39.963216589 +0000 UTC m=+851.324291011" watchObservedRunningTime="2025-12-04 17:40:39.97274098 +0000 UTC m=+851.333815392" Dec 04 17:40:40 crc kubenswrapper[4948]: I1204 17:40:40.002981 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:41 crc kubenswrapper[4948]: I1204 17:40:41.073309 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:41 crc kubenswrapper[4948]: I1204 17:40:41.073841 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:41 crc kubenswrapper[4948]: I1204 17:40:41.133560 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:41 crc kubenswrapper[4948]: I1204 17:40:41.922567 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjh7l"] Dec 04 17:40:41 crc kubenswrapper[4948]: I1204 17:40:41.960120 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sjh7l" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="registry-server" containerID="cri-o://cb372894875eef5f9c6fe1924c596c667ae07e145eec843617d105997120a5dc" gracePeriod=2 Dec 04 17:40:42 crc kubenswrapper[4948]: I1204 17:40:42.018846 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:42 crc kubenswrapper[4948]: I1204 17:40:42.051194 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:42 crc kubenswrapper[4948]: I1204 17:40:42.051274 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:42 crc kubenswrapper[4948]: I1204 17:40:42.090202 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:43 crc kubenswrapper[4948]: I1204 17:40:43.003025 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ps6x2" Dec 04 17:40:43 crc kubenswrapper[4948]: I1204 17:40:43.469066 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:43 crc kubenswrapper[4948]: I1204 17:40:43.469128 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:43 crc kubenswrapper[4948]: I1204 17:40:43.524336 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.017579 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.324022 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28b8g"] Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.324305 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-28b8g" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="registry-server" containerID="cri-o://7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e" gracePeriod=2 Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.550323 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.550391 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.599183 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.957300 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:44 crc kubenswrapper[4948]: I1204 17:40:44.957379 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:45 crc kubenswrapper[4948]: I1204 17:40:45.034532 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:45 crc kubenswrapper[4948]: I1204 17:40:45.047940 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9frz6" Dec 04 17:40:45 crc kubenswrapper[4948]: I1204 17:40:45.104787 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:45 crc kubenswrapper[4948]: I1204 17:40:45.846616 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:45 crc kubenswrapper[4948]: I1204 17:40:45.846978 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:45 crc kubenswrapper[4948]: I1204 17:40:45.880772 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k2vf9" 
Dec 04 17:40:48 crc kubenswrapper[4948]: I1204 17:40:48.684697 4948 generic.go:334] "Generic (PLEG): container finished" podID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerID="cb372894875eef5f9c6fe1924c596c667ae07e145eec843617d105997120a5dc" exitCode=0 Dec 04 17:40:48 crc kubenswrapper[4948]: I1204 17:40:48.684800 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerDied","Data":"cb372894875eef5f9c6fe1924c596c667ae07e145eec843617d105997120a5dc"} Dec 04 17:40:48 crc kubenswrapper[4948]: I1204 17:40:48.726721 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grq44"] Dec 04 17:40:48 crc kubenswrapper[4948]: I1204 17:40:48.745562 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k2vf9" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.557236 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.673317 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-catalog-content\") pod \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.673398 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-utilities\") pod \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.673499 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csp4j\" (UniqueName: \"kubernetes.io/projected/4c865fb9-4b3d-4753-8d2b-281e79ae6724-kube-api-access-csp4j\") pod \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\" (UID: \"4c865fb9-4b3d-4753-8d2b-281e79ae6724\") " Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.674440 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-utilities" (OuterVolumeSpecName: "utilities") pod "4c865fb9-4b3d-4753-8d2b-281e79ae6724" (UID: "4c865fb9-4b3d-4753-8d2b-281e79ae6724"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.684263 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c865fb9-4b3d-4753-8d2b-281e79ae6724-kube-api-access-csp4j" (OuterVolumeSpecName: "kube-api-access-csp4j") pod "4c865fb9-4b3d-4753-8d2b-281e79ae6724" (UID: "4c865fb9-4b3d-4753-8d2b-281e79ae6724"). InnerVolumeSpecName "kube-api-access-csp4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.690938 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sjh7l" event={"ID":"4c865fb9-4b3d-4753-8d2b-281e79ae6724","Type":"ContainerDied","Data":"8df1ffcfaa966d7cdfbc74fb47b1cd8378d57f6893682acd27e7a6e5ce8cdfa1"} Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.691011 4948 scope.go:117] "RemoveContainer" containerID="cb372894875eef5f9c6fe1924c596c667ae07e145eec843617d105997120a5dc" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.691119 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grq44" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="registry-server" containerID="cri-o://8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129" gracePeriod=2 Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.691219 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sjh7l" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.730444 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c865fb9-4b3d-4753-8d2b-281e79ae6724" (UID: "4c865fb9-4b3d-4753-8d2b-281e79ae6724"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.775366 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csp4j\" (UniqueName: \"kubernetes.io/projected/4c865fb9-4b3d-4753-8d2b-281e79ae6724-kube-api-access-csp4j\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.775396 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:49 crc kubenswrapper[4948]: I1204 17:40:49.775406 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c865fb9-4b3d-4753-8d2b-281e79ae6724-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:50 crc kubenswrapper[4948]: I1204 17:40:50.041637 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sjh7l"] Dec 04 17:40:50 crc kubenswrapper[4948]: I1204 17:40:50.045783 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sjh7l"] Dec 04 17:40:50 crc kubenswrapper[4948]: I1204 17:40:50.928119 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" path="/var/lib/kubelet/pods/4c865fb9-4b3d-4753-8d2b-281e79ae6724/volumes" Dec 04 17:40:51 crc kubenswrapper[4948]: E1204 17:40:51.073709 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e is running failed: container process not found" containerID="7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 17:40:51 crc kubenswrapper[4948]: E1204 17:40:51.074610 4948 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e is running failed: container process not found" containerID="7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 17:40:51 crc kubenswrapper[4948]: E1204 17:40:51.075472 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e is running failed: container process not found" containerID="7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 17:40:51 crc kubenswrapper[4948]: E1204 17:40:51.075645 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-28b8g" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="registry-server" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.193198 4948 scope.go:117] "RemoveContainer" containerID="5aa1eec057aa8ed7bfb36de7b1d8104d2a5eb72226a8457a15e13b29ea3a93e7" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.217449 4948 generic.go:334] "Generic (PLEG): container finished" podID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerID="7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e" exitCode=0 Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.217500 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" 
event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerDied","Data":"7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e"} Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.366881 4948 scope.go:117] "RemoveContainer" containerID="ef6dde1a0289a1c18889c46323cfa3e504a4740273744c24daa92085a7af366a" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.473637 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.503840 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4s8f\" (UniqueName: \"kubernetes.io/projected/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-kube-api-access-q4s8f\") pod \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.503890 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-catalog-content\") pod \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.503941 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-utilities\") pod \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\" (UID: \"f0c7d4cb-67a6-40b7-8a90-a1bb14032982\") " Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.504624 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-utilities" (OuterVolumeSpecName: "utilities") pod "f0c7d4cb-67a6-40b7-8a90-a1bb14032982" (UID: "f0c7d4cb-67a6-40b7-8a90-a1bb14032982"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.512232 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-kube-api-access-q4s8f" (OuterVolumeSpecName: "kube-api-access-q4s8f") pod "f0c7d4cb-67a6-40b7-8a90-a1bb14032982" (UID: "f0c7d4cb-67a6-40b7-8a90-a1bb14032982"). InnerVolumeSpecName "kube-api-access-q4s8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.519580 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.569561 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0c7d4cb-67a6-40b7-8a90-a1bb14032982" (UID: "f0c7d4cb-67a6-40b7-8a90-a1bb14032982"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.605013 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-utilities\") pod \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.605189 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s2fd\" (UniqueName: \"kubernetes.io/projected/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-kube-api-access-9s2fd\") pod \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.605267 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-catalog-content\") pod \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\" (UID: \"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c\") " Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.605611 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.605656 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4s8f\" (UniqueName: \"kubernetes.io/projected/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-kube-api-access-q4s8f\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.605684 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0c7d4cb-67a6-40b7-8a90-a1bb14032982-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.606304 
4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-utilities" (OuterVolumeSpecName: "utilities") pod "d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" (UID: "d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.607728 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-kube-api-access-9s2fd" (OuterVolumeSpecName: "kube-api-access-9s2fd") pod "d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" (UID: "d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c"). InnerVolumeSpecName "kube-api-access-9s2fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.706898 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.706955 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s2fd\" (UniqueName: \"kubernetes.io/projected/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-kube-api-access-9s2fd\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.742399 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" (UID: "d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:40:51 crc kubenswrapper[4948]: I1204 17:40:51.808394 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.224726 4948 generic.go:334] "Generic (PLEG): container finished" podID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerID="8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129" exitCode=0 Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.224767 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grq44" event={"ID":"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c","Type":"ContainerDied","Data":"8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129"} Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.225132 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grq44" event={"ID":"d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c","Type":"ContainerDied","Data":"89d038d162cf3e90343a60b878c17adf91f2f4900b26e3a5d971073af6cf3608"} Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.224874 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grq44" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.225154 4948 scope.go:117] "RemoveContainer" containerID="8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.227841 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28b8g" event={"ID":"f0c7d4cb-67a6-40b7-8a90-a1bb14032982","Type":"ContainerDied","Data":"df294c7b0ed94a96706b0e3900f3e110d1bbbafa4c7aa523bb70853d613e45ec"} Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.227943 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28b8g" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.243618 4948 scope.go:117] "RemoveContainer" containerID="73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.305472 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grq44"] Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.311197 4948 scope.go:117] "RemoveContainer" containerID="082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.313703 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grq44"] Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.338283 4948 scope.go:117] "RemoveContainer" containerID="8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.338574 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28b8g"] Dec 04 17:40:52 crc kubenswrapper[4948]: E1204 17:40:52.338853 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129\": container with ID starting with 8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129 not found: ID does not exist" containerID="8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.338890 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129"} err="failed to get container status \"8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129\": rpc error: code = NotFound desc = could not find container \"8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129\": container with ID starting with 8de7913d792a2f55cd00ea7d563a334b01bc091c01da89e44e1c00f6b21e0129 not found: ID does not exist" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.338915 4948 scope.go:117] "RemoveContainer" containerID="73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308" Dec 04 17:40:52 crc kubenswrapper[4948]: E1204 17:40:52.339189 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308\": container with ID starting with 73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308 not found: ID does not exist" containerID="73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.339216 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308"} err="failed to get container status \"73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308\": rpc error: code = NotFound desc = could not find container \"73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308\": 
container with ID starting with 73f3c18a68ded82e6a499d46ad52be72234d85041fcb8f288f15a938d1550308 not found: ID does not exist" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.339233 4948 scope.go:117] "RemoveContainer" containerID="082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575" Dec 04 17:40:52 crc kubenswrapper[4948]: E1204 17:40:52.339521 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575\": container with ID starting with 082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575 not found: ID does not exist" containerID="082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.339545 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575"} err="failed to get container status \"082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575\": rpc error: code = NotFound desc = could not find container \"082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575\": container with ID starting with 082dc5143d8efc33aca295775f6fae641ce9e29677e4f9285e72ad5a0f45f575 not found: ID does not exist" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.339561 4948 scope.go:117] "RemoveContainer" containerID="7562cb2c077ff885bf1cb936ab6aa274c0004a3cd06df6a3c344e16ade72ee5e" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.343269 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-28b8g"] Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.355541 4948 scope.go:117] "RemoveContainer" containerID="ec9d0b1996e4aa35d813cb7c65759cdbd59ff9c714daab92261496aebdb9c91e" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.371211 4948 scope.go:117] "RemoveContainer" 
containerID="4779a63729612c7f59d4a4ea01ceaaf082897c9bc9797bc6808ecb84761685d4" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.927230 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" path="/var/lib/kubelet/pods/d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c/volumes" Dec 04 17:40:52 crc kubenswrapper[4948]: I1204 17:40:52.928696 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" path="/var/lib/kubelet/pods/f0c7d4cb-67a6-40b7-8a90-a1bb14032982/volumes" Dec 04 17:41:29 crc kubenswrapper[4948]: I1204 17:41:29.415692 4948 scope.go:117] "RemoveContainer" containerID="59b67f5c18c4d670f05dedfe0c024d0de4f1a31063fc40c8a343a9fd1cf656ac" Dec 04 17:41:29 crc kubenswrapper[4948]: I1204 17:41:29.446542 4948 scope.go:117] "RemoveContainer" containerID="89108754633e4e4f5c9ff509dd230d63bc42aa9d84ef848c5b23ca0db3409b2c" Dec 04 17:41:29 crc kubenswrapper[4948]: I1204 17:41:29.477452 4948 scope.go:117] "RemoveContainer" containerID="5debea8774c94fd40c94bc62e49b9b5cf084a570940d077f79aa7e6ecfa3005c" Dec 04 17:42:40 crc kubenswrapper[4948]: I1204 17:42:40.625721 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:42:40 crc kubenswrapper[4948]: I1204 17:42:40.626418 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:43:10 crc kubenswrapper[4948]: I1204 17:43:10.625084 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:43:10 crc kubenswrapper[4948]: I1204 17:43:10.625679 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:43:40 crc kubenswrapper[4948]: I1204 17:43:40.625341 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:43:40 crc kubenswrapper[4948]: I1204 17:43:40.625938 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:43:40 crc kubenswrapper[4948]: I1204 17:43:40.626002 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:43:40 crc kubenswrapper[4948]: I1204 17:43:40.626772 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3b577e7a9fb6d063090010b2b72a957463bccee6de5d548495cb9f10c1fd00f"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Dec 04 17:43:40 crc kubenswrapper[4948]: I1204 17:43:40.626870 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://f3b577e7a9fb6d063090010b2b72a957463bccee6de5d548495cb9f10c1fd00f" gracePeriod=600 Dec 04 17:43:41 crc kubenswrapper[4948]: I1204 17:43:41.365559 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="f3b577e7a9fb6d063090010b2b72a957463bccee6de5d548495cb9f10c1fd00f" exitCode=0 Dec 04 17:43:41 crc kubenswrapper[4948]: I1204 17:43:41.365604 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"f3b577e7a9fb6d063090010b2b72a957463bccee6de5d548495cb9f10c1fd00f"} Dec 04 17:43:41 crc kubenswrapper[4948]: I1204 17:43:41.365983 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"889a470646dcbfc4695d99b33c11f25b714a5667a661661c5be2000a77114372"} Dec 04 17:43:41 crc kubenswrapper[4948]: I1204 17:43:41.366011 4948 scope.go:117] "RemoveContainer" containerID="6308f9cf478332f1e14942b84d9243079dd40a48776f7e33bd8faea91d259d32" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.141599 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv"] Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142530 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="extract-content" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142560 4948 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="extract-content" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142579 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="extract-content" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142591 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="extract-content" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142618 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142633 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142654 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="extract-utilities" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142666 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="extract-utilities" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142691 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142703 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142721 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="extract-utilities" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142733 4948 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="extract-utilities" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142750 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="extract-content" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142761 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="extract-content" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142780 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142792 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: E1204 17:45:00.142810 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="extract-utilities" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142821 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="extract-utilities" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.142992 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c7d4cb-67a6-40b7-8a90-a1bb14032982" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.143011 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3065ebe-0ae6-4e2a-bb80-6877c44e3e7c" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.143032 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c865fb9-4b3d-4753-8d2b-281e79ae6724" containerName="registry-server" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.143638 4948 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.147026 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.147183 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.149101 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv"] Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.195748 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59db2ba4-208f-47bc-87f4-3c357f18db23-config-volume\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.196136 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmgxm\" (UniqueName: \"kubernetes.io/projected/59db2ba4-208f-47bc-87f4-3c357f18db23-kube-api-access-bmgxm\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.196179 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59db2ba4-208f-47bc-87f4-3c357f18db23-secret-volume\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.297594 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgxm\" (UniqueName: \"kubernetes.io/projected/59db2ba4-208f-47bc-87f4-3c357f18db23-kube-api-access-bmgxm\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.297649 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59db2ba4-208f-47bc-87f4-3c357f18db23-secret-volume\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.297732 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59db2ba4-208f-47bc-87f4-3c357f18db23-config-volume\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.298998 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59db2ba4-208f-47bc-87f4-3c357f18db23-config-volume\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.304630 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/59db2ba4-208f-47bc-87f4-3c357f18db23-secret-volume\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.313849 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmgxm\" (UniqueName: \"kubernetes.io/projected/59db2ba4-208f-47bc-87f4-3c357f18db23-kube-api-access-bmgxm\") pod \"collect-profiles-29414505-tckhv\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.472357 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.689216 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv"] Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.886815 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" event={"ID":"59db2ba4-208f-47bc-87f4-3c357f18db23","Type":"ContainerStarted","Data":"18646656f7c3653af7102e027e29cb28e124f4eb8e9ef366b86bdcbd5c4ea11a"} Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.886853 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" event={"ID":"59db2ba4-208f-47bc-87f4-3c357f18db23","Type":"ContainerStarted","Data":"07d1cc49076adf88c957d141e63be9d94e86076f8e3091ade613c4c5fcd487c2"} Dec 04 17:45:00 crc kubenswrapper[4948]: I1204 17:45:00.902501 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" 
podStartSLOduration=0.902477744 podStartE2EDuration="902.477744ms" podCreationTimestamp="2025-12-04 17:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:45:00.898747515 +0000 UTC m=+1112.259821917" watchObservedRunningTime="2025-12-04 17:45:00.902477744 +0000 UTC m=+1112.263552146" Dec 04 17:45:01 crc kubenswrapper[4948]: I1204 17:45:01.894487 4948 generic.go:334] "Generic (PLEG): container finished" podID="59db2ba4-208f-47bc-87f4-3c357f18db23" containerID="18646656f7c3653af7102e027e29cb28e124f4eb8e9ef366b86bdcbd5c4ea11a" exitCode=0 Dec 04 17:45:01 crc kubenswrapper[4948]: I1204 17:45:01.894541 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" event={"ID":"59db2ba4-208f-47bc-87f4-3c357f18db23","Type":"ContainerDied","Data":"18646656f7c3653af7102e027e29cb28e124f4eb8e9ef366b86bdcbd5c4ea11a"} Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.186968 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.233687 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59db2ba4-208f-47bc-87f4-3c357f18db23-config-volume\") pod \"59db2ba4-208f-47bc-87f4-3c357f18db23\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.234266 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59db2ba4-208f-47bc-87f4-3c357f18db23-secret-volume\") pod \"59db2ba4-208f-47bc-87f4-3c357f18db23\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.234356 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmgxm\" (UniqueName: \"kubernetes.io/projected/59db2ba4-208f-47bc-87f4-3c357f18db23-kube-api-access-bmgxm\") pod \"59db2ba4-208f-47bc-87f4-3c357f18db23\" (UID: \"59db2ba4-208f-47bc-87f4-3c357f18db23\") " Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.234839 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59db2ba4-208f-47bc-87f4-3c357f18db23-config-volume" (OuterVolumeSpecName: "config-volume") pod "59db2ba4-208f-47bc-87f4-3c357f18db23" (UID: "59db2ba4-208f-47bc-87f4-3c357f18db23"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.239870 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59db2ba4-208f-47bc-87f4-3c357f18db23-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "59db2ba4-208f-47bc-87f4-3c357f18db23" (UID: "59db2ba4-208f-47bc-87f4-3c357f18db23"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.241378 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59db2ba4-208f-47bc-87f4-3c357f18db23-kube-api-access-bmgxm" (OuterVolumeSpecName: "kube-api-access-bmgxm") pod "59db2ba4-208f-47bc-87f4-3c357f18db23" (UID: "59db2ba4-208f-47bc-87f4-3c357f18db23"). InnerVolumeSpecName "kube-api-access-bmgxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.335822 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59db2ba4-208f-47bc-87f4-3c357f18db23-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.335882 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmgxm\" (UniqueName: \"kubernetes.io/projected/59db2ba4-208f-47bc-87f4-3c357f18db23-kube-api-access-bmgxm\") on node \"crc\" DevicePath \"\"" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.335905 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59db2ba4-208f-47bc-87f4-3c357f18db23-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.909815 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" event={"ID":"59db2ba4-208f-47bc-87f4-3c357f18db23","Type":"ContainerDied","Data":"07d1cc49076adf88c957d141e63be9d94e86076f8e3091ade613c4c5fcd487c2"} Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.909868 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d1cc49076adf88c957d141e63be9d94e86076f8e3091ade613c4c5fcd487c2" Dec 04 17:45:03 crc kubenswrapper[4948]: I1204 17:45:03.909927 4948 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv" Dec 04 17:45:40 crc kubenswrapper[4948]: I1204 17:45:40.625316 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:45:40 crc kubenswrapper[4948]: I1204 17:45:40.626161 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:46:10 crc kubenswrapper[4948]: I1204 17:46:10.625226 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:46:10 crc kubenswrapper[4948]: I1204 17:46:10.625835 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:46:40 crc kubenswrapper[4948]: I1204 17:46:40.625168 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:46:40 crc kubenswrapper[4948]: 
I1204 17:46:40.625987 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:46:40 crc kubenswrapper[4948]: I1204 17:46:40.626118 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:46:40 crc kubenswrapper[4948]: I1204 17:46:40.627168 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"889a470646dcbfc4695d99b33c11f25b714a5667a661661c5be2000a77114372"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:46:40 crc kubenswrapper[4948]: I1204 17:46:40.627266 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://889a470646dcbfc4695d99b33c11f25b714a5667a661661c5be2000a77114372" gracePeriod=600 Dec 04 17:46:41 crc kubenswrapper[4948]: I1204 17:46:41.546221 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="889a470646dcbfc4695d99b33c11f25b714a5667a661661c5be2000a77114372" exitCode=0 Dec 04 17:46:41 crc kubenswrapper[4948]: I1204 17:46:41.546336 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"889a470646dcbfc4695d99b33c11f25b714a5667a661661c5be2000a77114372"} Dec 04 17:46:41 crc 
kubenswrapper[4948]: I1204 17:46:41.546922 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"e233a346ade9cd965a009c66f42b1cc18967a3cc196c7fca4634b21c2b68b2ec"} Dec 04 17:46:41 crc kubenswrapper[4948]: I1204 17:46:41.546964 4948 scope.go:117] "RemoveContainer" containerID="f3b577e7a9fb6d063090010b2b72a957463bccee6de5d548495cb9f10c1fd00f" Dec 04 17:48:40 crc kubenswrapper[4948]: I1204 17:48:40.624686 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:48:40 crc kubenswrapper[4948]: I1204 17:48:40.625368 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.036997 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq"] Dec 04 17:48:55 crc kubenswrapper[4948]: E1204 17:48:55.038692 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59db2ba4-208f-47bc-87f4-3c357f18db23" containerName="collect-profiles" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.038768 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="59db2ba4-208f-47bc-87f4-3c357f18db23" containerName="collect-profiles" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.038928 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="59db2ba4-208f-47bc-87f4-3c357f18db23" containerName="collect-profiles" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.039869 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.044448 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.053600 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq"] Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.163121 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.163167 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wkg6\" (UniqueName: \"kubernetes.io/projected/b34a0876-8eb9-4701-8597-f9020c25f2d8-kube-api-access-7wkg6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.163193 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-bundle\") pod 
\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.264135 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.264527 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wkg6\" (UniqueName: \"kubernetes.io/projected/b34a0876-8eb9-4701-8597-f9020c25f2d8-kube-api-access-7wkg6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.264681 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.264847 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.265160 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.286821 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wkg6\" (UniqueName: \"kubernetes.io/projected/b34a0876-8eb9-4701-8597-f9020c25f2d8-kube-api-access-7wkg6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.361902 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:48:55 crc kubenswrapper[4948]: I1204 17:48:55.562802 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq"] Dec 04 17:48:56 crc kubenswrapper[4948]: I1204 17:48:56.390175 4948 generic.go:334] "Generic (PLEG): container finished" podID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerID="e79d68cbd89fbaa98307344288e2f7747b010268b386b4cbbfab86186ff69553" exitCode=0 Dec 04 17:48:56 crc kubenswrapper[4948]: I1204 17:48:56.390266 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" event={"ID":"b34a0876-8eb9-4701-8597-f9020c25f2d8","Type":"ContainerDied","Data":"e79d68cbd89fbaa98307344288e2f7747b010268b386b4cbbfab86186ff69553"} Dec 04 17:48:56 crc kubenswrapper[4948]: I1204 17:48:56.390557 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" event={"ID":"b34a0876-8eb9-4701-8597-f9020c25f2d8","Type":"ContainerStarted","Data":"c56dd6e4041482aeb22dda7da8df9cc81ff9ae8d02fad303992945ce2392d29f"} Dec 04 17:48:56 crc kubenswrapper[4948]: I1204 17:48:56.392576 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 17:48:58 crc kubenswrapper[4948]: I1204 17:48:58.406866 4948 generic.go:334] "Generic (PLEG): container finished" podID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerID="91bfdd225262ff0979b15715d8fdf2ea0c2d8987ed36a7a6db5cc36465c81c94" exitCode=0 Dec 04 17:48:58 crc kubenswrapper[4948]: I1204 17:48:58.406931 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" 
event={"ID":"b34a0876-8eb9-4701-8597-f9020c25f2d8","Type":"ContainerDied","Data":"91bfdd225262ff0979b15715d8fdf2ea0c2d8987ed36a7a6db5cc36465c81c94"} Dec 04 17:48:59 crc kubenswrapper[4948]: I1204 17:48:59.417799 4948 generic.go:334] "Generic (PLEG): container finished" podID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerID="9448a8bb920d6704f959e78fbaa01ee98cbed3d4dda6cb85064f39fdb42e4e5b" exitCode=0 Dec 04 17:48:59 crc kubenswrapper[4948]: I1204 17:48:59.417860 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" event={"ID":"b34a0876-8eb9-4701-8597-f9020c25f2d8","Type":"ContainerDied","Data":"9448a8bb920d6704f959e78fbaa01ee98cbed3d4dda6cb85064f39fdb42e4e5b"} Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.689798 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.852511 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-util\") pod \"b34a0876-8eb9-4701-8597-f9020c25f2d8\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.852561 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-bundle\") pod \"b34a0876-8eb9-4701-8597-f9020c25f2d8\" (UID: \"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.852660 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wkg6\" (UniqueName: \"kubernetes.io/projected/b34a0876-8eb9-4701-8597-f9020c25f2d8-kube-api-access-7wkg6\") pod \"b34a0876-8eb9-4701-8597-f9020c25f2d8\" (UID: 
\"b34a0876-8eb9-4701-8597-f9020c25f2d8\") " Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.853302 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-bundle" (OuterVolumeSpecName: "bundle") pod "b34a0876-8eb9-4701-8597-f9020c25f2d8" (UID: "b34a0876-8eb9-4701-8597-f9020c25f2d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.861036 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34a0876-8eb9-4701-8597-f9020c25f2d8-kube-api-access-7wkg6" (OuterVolumeSpecName: "kube-api-access-7wkg6") pod "b34a0876-8eb9-4701-8597-f9020c25f2d8" (UID: "b34a0876-8eb9-4701-8597-f9020c25f2d8"). InnerVolumeSpecName "kube-api-access-7wkg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.866439 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-util" (OuterVolumeSpecName: "util") pod "b34a0876-8eb9-4701-8597-f9020c25f2d8" (UID: "b34a0876-8eb9-4701-8597-f9020c25f2d8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.954131 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wkg6\" (UniqueName: \"kubernetes.io/projected/b34a0876-8eb9-4701-8597-f9020c25f2d8-kube-api-access-7wkg6\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.954173 4948 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-util\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:00 crc kubenswrapper[4948]: I1204 17:49:00.954186 4948 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b34a0876-8eb9-4701-8597-f9020c25f2d8-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:01 crc kubenswrapper[4948]: I1204 17:49:01.437975 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" event={"ID":"b34a0876-8eb9-4701-8597-f9020c25f2d8","Type":"ContainerDied","Data":"c56dd6e4041482aeb22dda7da8df9cc81ff9ae8d02fad303992945ce2392d29f"} Dec 04 17:49:01 crc kubenswrapper[4948]: I1204 17:49:01.438017 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56dd6e4041482aeb22dda7da8df9cc81ff9ae8d02fad303992945ce2392d29f" Dec 04 17:49:01 crc kubenswrapper[4948]: I1204 17:49:01.438099 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.920947 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf"] Dec 04 17:49:03 crc kubenswrapper[4948]: E1204 17:49:03.921500 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="util" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.921514 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="util" Dec 04 17:49:03 crc kubenswrapper[4948]: E1204 17:49:03.921528 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="extract" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.921535 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="extract" Dec 04 17:49:03 crc kubenswrapper[4948]: E1204 17:49:03.921546 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="pull" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.921555 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="pull" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.921669 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34a0876-8eb9-4701-8597-f9020c25f2d8" containerName="extract" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.922180 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.924531 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.924693 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6w2k4" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.924770 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.932802 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf"] Dec 04 17:49:03 crc kubenswrapper[4948]: I1204 17:49:03.993779 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7gpn\" (UniqueName: \"kubernetes.io/projected/16777efc-db01-42ba-8a24-e966989fb402-kube-api-access-q7gpn\") pod \"nmstate-operator-5b5b58f5c8-r99vf\" (UID: \"16777efc-db01-42ba-8a24-e966989fb402\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" Dec 04 17:49:04 crc kubenswrapper[4948]: I1204 17:49:04.095132 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7gpn\" (UniqueName: \"kubernetes.io/projected/16777efc-db01-42ba-8a24-e966989fb402-kube-api-access-q7gpn\") pod \"nmstate-operator-5b5b58f5c8-r99vf\" (UID: \"16777efc-db01-42ba-8a24-e966989fb402\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" Dec 04 17:49:04 crc kubenswrapper[4948]: I1204 17:49:04.116017 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7gpn\" (UniqueName: \"kubernetes.io/projected/16777efc-db01-42ba-8a24-e966989fb402-kube-api-access-q7gpn\") pod \"nmstate-operator-5b5b58f5c8-r99vf\" (UID: 
\"16777efc-db01-42ba-8a24-e966989fb402\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" Dec 04 17:49:04 crc kubenswrapper[4948]: I1204 17:49:04.236179 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" Dec 04 17:49:04 crc kubenswrapper[4948]: I1204 17:49:04.430526 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf"] Dec 04 17:49:04 crc kubenswrapper[4948]: I1204 17:49:04.459081 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" event={"ID":"16777efc-db01-42ba-8a24-e966989fb402","Type":"ContainerStarted","Data":"42f5b6214aeeda3054ee2f41c69fb655522dde79c820621d7e267c3c554dc54c"} Dec 04 17:49:06 crc kubenswrapper[4948]: I1204 17:49:06.472108 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" event={"ID":"16777efc-db01-42ba-8a24-e966989fb402","Type":"ContainerStarted","Data":"41fa2061c520267ed7f8d294c91caf91a1e431531c7d00f41be1c018f4d53996"} Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.330915 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-r99vf" podStartSLOduration=2.520288848 podStartE2EDuration="4.330890138s" podCreationTimestamp="2025-12-04 17:49:03 +0000 UTC" firstStartedPulling="2025-12-04 17:49:04.442124181 +0000 UTC m=+1355.803198583" lastFinishedPulling="2025-12-04 17:49:06.252725471 +0000 UTC m=+1357.613799873" observedRunningTime="2025-12-04 17:49:06.492712115 +0000 UTC m=+1357.853786537" watchObservedRunningTime="2025-12-04 17:49:07.330890138 +0000 UTC m=+1358.691964560" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.333360 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 
17:49:07.334519 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.336434 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xgmmn" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.339627 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.340621 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.342766 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.346234 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.360032 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.405981 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8tjwj"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.414937 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.435293 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c48807df-003d-4f1b-8819-94dc6017e382-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.435358 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh9b\" (UniqueName: \"kubernetes.io/projected/c48807df-003d-4f1b-8819-94dc6017e382-kube-api-access-cdh9b\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.435520 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442zc\" (UniqueName: \"kubernetes.io/projected/7781d778-2308-4cbd-aaa0-73588dc5f945-kube-api-access-442zc\") pod \"nmstate-metrics-7f946cbc9-sjx9c\" (UID: \"7781d778-2308-4cbd-aaa0-73588dc5f945\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.484824 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.486162 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.487810 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.490216 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rzbd7" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.491549 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.491731 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536269 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c48807df-003d-4f1b-8819-94dc6017e382-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536329 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw26g\" (UniqueName: \"kubernetes.io/projected/b9b89e09-1bdf-47ea-a59a-fa532aae1589-kube-api-access-qw26g\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536349 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-dbus-socket\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536367 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" 
(UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-nmstate-lock\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536389 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-ovs-socket\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536413 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh9b\" (UniqueName: \"kubernetes.io/projected/c48807df-003d-4f1b-8819-94dc6017e382-kube-api-access-cdh9b\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:07 crc kubenswrapper[4948]: E1204 17:49:07.536517 4948 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.536547 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442zc\" (UniqueName: \"kubernetes.io/projected/7781d778-2308-4cbd-aaa0-73588dc5f945-kube-api-access-442zc\") pod \"nmstate-metrics-7f946cbc9-sjx9c\" (UID: \"7781d778-2308-4cbd-aaa0-73588dc5f945\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" Dec 04 17:49:07 crc kubenswrapper[4948]: E1204 17:49:07.536637 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c48807df-003d-4f1b-8819-94dc6017e382-tls-key-pair podName:c48807df-003d-4f1b-8819-94dc6017e382 nodeName:}" failed. 
No retries permitted until 2025-12-04 17:49:08.036597725 +0000 UTC m=+1359.397672177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c48807df-003d-4f1b-8819-94dc6017e382-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-qxx5m" (UID: "c48807df-003d-4f1b-8819-94dc6017e382") : secret "openshift-nmstate-webhook" not found Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.555422 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh9b\" (UniqueName: \"kubernetes.io/projected/c48807df-003d-4f1b-8819-94dc6017e382-kube-api-access-cdh9b\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.563494 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442zc\" (UniqueName: \"kubernetes.io/projected/7781d778-2308-4cbd-aaa0-73588dc5f945-kube-api-access-442zc\") pod \"nmstate-metrics-7f946cbc9-sjx9c\" (UID: \"7781d778-2308-4cbd-aaa0-73588dc5f945\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637346 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/08e03b38-421e-4e93-a695-6e090536ecae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637406 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/08e03b38-421e-4e93-a695-6e090536ecae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: 
\"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637507 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4r7\" (UniqueName: \"kubernetes.io/projected/08e03b38-421e-4e93-a695-6e090536ecae-kube-api-access-9n4r7\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637537 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw26g\" (UniqueName: \"kubernetes.io/projected/b9b89e09-1bdf-47ea-a59a-fa532aae1589-kube-api-access-qw26g\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637561 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-dbus-socket\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637581 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-nmstate-lock\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637611 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-ovs-socket\") pod \"nmstate-handler-8tjwj\" (UID: 
\"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637675 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-ovs-socket\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637741 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-nmstate-lock\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.637845 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b9b89e09-1bdf-47ea-a59a-fa532aae1589-dbus-socket\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.658950 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw26g\" (UniqueName: \"kubernetes.io/projected/b9b89e09-1bdf-47ea-a59a-fa532aae1589-kube-api-access-qw26g\") pod \"nmstate-handler-8tjwj\" (UID: \"b9b89e09-1bdf-47ea-a59a-fa532aae1589\") " pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.661400 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.667759 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bd4d65fb6-6f2fl"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.668589 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.692896 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bd4d65fb6-6f2fl"] Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.737382 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738070 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-oauth-config\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738102 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-service-ca\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738126 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n4r7\" (UniqueName: \"kubernetes.io/projected/08e03b38-421e-4e93-a695-6e090536ecae-kube-api-access-9n4r7\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738152 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-oauth-serving-cert\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738185 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-serving-cert\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738222 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/08e03b38-421e-4e93-a695-6e090536ecae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738310 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkv5h\" (UniqueName: \"kubernetes.io/projected/f6eeee92-700e-4bc6-828a-4b395bc982ed-kube-api-access-tkv5h\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738382 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/08e03b38-421e-4e93-a695-6e090536ecae-nginx-conf\") pod 
\"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738428 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-trusted-ca-bundle\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.738498 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-config\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.739407 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/08e03b38-421e-4e93-a695-6e090536ecae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.745806 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/08e03b38-421e-4e93-a695-6e090536ecae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.755858 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n4r7\" (UniqueName: 
\"kubernetes.io/projected/08e03b38-421e-4e93-a695-6e090536ecae-kube-api-access-9n4r7\") pod \"nmstate-console-plugin-7fbb5f6569-bwmvw\" (UID: \"08e03b38-421e-4e93-a695-6e090536ecae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.804704 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839773 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-trusted-ca-bundle\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839830 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-config\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839875 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-oauth-config\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839893 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-service-ca\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " 
pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839923 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-oauth-serving-cert\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839954 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-serving-cert\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.839971 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkv5h\" (UniqueName: \"kubernetes.io/projected/f6eeee92-700e-4bc6-828a-4b395bc982ed-kube-api-access-tkv5h\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.841130 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-trusted-ca-bundle\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.845844 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-serving-cert\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " 
pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.846656 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-config\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.847336 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-service-ca\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.853442 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6eeee92-700e-4bc6-828a-4b395bc982ed-console-oauth-config\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.863023 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6eeee92-700e-4bc6-828a-4b395bc982ed-oauth-serving-cert\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc kubenswrapper[4948]: I1204 17:49:07.867032 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkv5h\" (UniqueName: \"kubernetes.io/projected/f6eeee92-700e-4bc6-828a-4b395bc982ed-kube-api-access-tkv5h\") pod \"console-6bd4d65fb6-6f2fl\" (UID: \"f6eeee92-700e-4bc6-828a-4b395bc982ed\") " pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:07 crc 
kubenswrapper[4948]: I1204 17:49:07.896421 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c"] Dec 04 17:49:07 crc kubenswrapper[4948]: W1204 17:49:07.902286 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7781d778_2308_4cbd_aaa0_73588dc5f945.slice/crio-b243505f3dcf3c2f3c615b5865e7fe1f8d485823d6cd34e5fe2503684e97c865 WatchSource:0}: Error finding container b243505f3dcf3c2f3c615b5865e7fe1f8d485823d6cd34e5fe2503684e97c865: Status 404 returned error can't find the container with id b243505f3dcf3c2f3c615b5865e7fe1f8d485823d6cd34e5fe2503684e97c865 Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.026967 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.042581 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c48807df-003d-4f1b-8819-94dc6017e382-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.045880 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c48807df-003d-4f1b-8819-94dc6017e382-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-qxx5m\" (UID: \"c48807df-003d-4f1b-8819-94dc6017e382\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.054660 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw"] Dec 04 17:49:08 crc kubenswrapper[4948]: W1204 17:49:08.060558 4948 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e03b38_421e_4e93_a695_6e090536ecae.slice/crio-b3d90733a0eca33b808478ef3000a152e114d42ec132b30658b639088e746165 WatchSource:0}: Error finding container b3d90733a0eca33b808478ef3000a152e114d42ec132b30658b639088e746165: Status 404 returned error can't find the container with id b3d90733a0eca33b808478ef3000a152e114d42ec132b30658b639088e746165 Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.189159 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bd4d65fb6-6f2fl"] Dec 04 17:49:08 crc kubenswrapper[4948]: W1204 17:49:08.196804 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6eeee92_700e_4bc6_828a_4b395bc982ed.slice/crio-f459ecfe38dfda21f2536904e65d9a332a70faaa21a02fee238d97236c2db9f9 WatchSource:0}: Error finding container f459ecfe38dfda21f2536904e65d9a332a70faaa21a02fee238d97236c2db9f9: Status 404 returned error can't find the container with id f459ecfe38dfda21f2536904e65d9a332a70faaa21a02fee238d97236c2db9f9 Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.283737 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.491802 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" event={"ID":"7781d778-2308-4cbd-aaa0-73588dc5f945","Type":"ContainerStarted","Data":"b243505f3dcf3c2f3c615b5865e7fe1f8d485823d6cd34e5fe2503684e97c865"} Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.497599 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd4d65fb6-6f2fl" event={"ID":"f6eeee92-700e-4bc6-828a-4b395bc982ed","Type":"ContainerStarted","Data":"1826cdf3b9d3a57cfe667c37814711e68d79fa7f5fe0a64040a9ba7b2946de36"} Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.497661 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bd4d65fb6-6f2fl" event={"ID":"f6eeee92-700e-4bc6-828a-4b395bc982ed","Type":"ContainerStarted","Data":"f459ecfe38dfda21f2536904e65d9a332a70faaa21a02fee238d97236c2db9f9"} Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.498667 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8tjwj" event={"ID":"b9b89e09-1bdf-47ea-a59a-fa532aae1589","Type":"ContainerStarted","Data":"72499cc4a0a426e541be6cc76f5011f14fe6382a28104d333589a252e104d025"} Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.500745 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" event={"ID":"08e03b38-421e-4e93-a695-6e090536ecae","Type":"ContainerStarted","Data":"b3d90733a0eca33b808478ef3000a152e114d42ec132b30658b639088e746165"} Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.520772 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m"] Dec 04 17:49:08 crc kubenswrapper[4948]: I1204 17:49:08.528373 4948 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/console-6bd4d65fb6-6f2fl" podStartSLOduration=1.528353157 podStartE2EDuration="1.528353157s" podCreationTimestamp="2025-12-04 17:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:49:08.526003301 +0000 UTC m=+1359.887077713" watchObservedRunningTime="2025-12-04 17:49:08.528353157 +0000 UTC m=+1359.889427569" Dec 04 17:49:09 crc kubenswrapper[4948]: I1204 17:49:09.506189 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" event={"ID":"c48807df-003d-4f1b-8819-94dc6017e382","Type":"ContainerStarted","Data":"80dc5f9e7fe4527c7af24b967e41a73e73345e266f4c9440bd55e1b482073756"} Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.513408 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" event={"ID":"c48807df-003d-4f1b-8819-94dc6017e382","Type":"ContainerStarted","Data":"105cf6afac9e603c6d2121410a9d9da60c17fc3c21e08ecd8ce57c3ad8511f8c"} Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.515029 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.516295 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8tjwj" event={"ID":"b9b89e09-1bdf-47ea-a59a-fa532aae1589","Type":"ContainerStarted","Data":"f03a59dd9c148bee3ec5d2cce90a3669a2ec87e76173b262096eb1406151a869"} Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.517281 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.519286 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" 
event={"ID":"08e03b38-421e-4e93-a695-6e090536ecae","Type":"ContainerStarted","Data":"61fcb693be2cfb969e6f1feec6bf8c3d83fd426df7bbcc89c0d0eb37fcd9d2f5"} Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.559649 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" podStartSLOduration=1.997962653 podStartE2EDuration="3.559628637s" podCreationTimestamp="2025-12-04 17:49:07 +0000 UTC" firstStartedPulling="2025-12-04 17:49:08.532668729 +0000 UTC m=+1359.893743141" lastFinishedPulling="2025-12-04 17:49:10.094334723 +0000 UTC m=+1361.455409125" observedRunningTime="2025-12-04 17:49:10.544220693 +0000 UTC m=+1361.905295115" watchObservedRunningTime="2025-12-04 17:49:10.559628637 +0000 UTC m=+1361.920703039" Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.560181 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bwmvw" podStartSLOduration=1.5257271129999999 podStartE2EDuration="3.560171103s" podCreationTimestamp="2025-12-04 17:49:07 +0000 UTC" firstStartedPulling="2025-12-04 17:49:08.062631621 +0000 UTC m=+1359.423706023" lastFinishedPulling="2025-12-04 17:49:10.097075611 +0000 UTC m=+1361.458150013" observedRunningTime="2025-12-04 17:49:10.557925419 +0000 UTC m=+1361.918999821" watchObservedRunningTime="2025-12-04 17:49:10.560171103 +0000 UTC m=+1361.921245505" Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.574669 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8tjwj" podStartSLOduration=1.239674851 podStartE2EDuration="3.574654281s" podCreationTimestamp="2025-12-04 17:49:07 +0000 UTC" firstStartedPulling="2025-12-04 17:49:07.75779746 +0000 UTC m=+1359.118871862" lastFinishedPulling="2025-12-04 17:49:10.0927769 +0000 UTC m=+1361.453851292" observedRunningTime="2025-12-04 17:49:10.571659027 +0000 UTC m=+1361.932733439" 
watchObservedRunningTime="2025-12-04 17:49:10.574654281 +0000 UTC m=+1361.935728683" Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.625575 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:49:10 crc kubenswrapper[4948]: I1204 17:49:10.625652 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:49:17 crc kubenswrapper[4948]: I1204 17:49:17.772721 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8tjwj" Dec 04 17:49:18 crc kubenswrapper[4948]: I1204 17:49:18.028192 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:18 crc kubenswrapper[4948]: I1204 17:49:18.028672 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:18 crc kubenswrapper[4948]: I1204 17:49:18.035085 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:18 crc kubenswrapper[4948]: I1204 17:49:18.598745 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bd4d65fb6-6f2fl" Dec 04 17:49:18 crc kubenswrapper[4948]: I1204 17:49:18.666071 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-m5k2z"] Dec 04 17:49:21 crc kubenswrapper[4948]: I1204 17:49:21.612261 4948 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" event={"ID":"7781d778-2308-4cbd-aaa0-73588dc5f945","Type":"ContainerStarted","Data":"41b3121bf8382345c81d5be0f493cac3583d0a36bac6fc78257a57b3d384abc7"} Dec 04 17:49:23 crc kubenswrapper[4948]: I1204 17:49:23.640464 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" event={"ID":"7781d778-2308-4cbd-aaa0-73588dc5f945","Type":"ContainerStarted","Data":"5e165b5017b0b5ad7913494338712bfedf9ae38c91c521be5416c829ef96fa69"} Dec 04 17:49:23 crc kubenswrapper[4948]: I1204 17:49:23.664181 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-sjx9c" podStartSLOduration=1.656499889 podStartE2EDuration="16.664144976s" podCreationTimestamp="2025-12-04 17:49:07 +0000 UTC" firstStartedPulling="2025-12-04 17:49:07.904381701 +0000 UTC m=+1359.265456103" lastFinishedPulling="2025-12-04 17:49:22.912026788 +0000 UTC m=+1374.273101190" observedRunningTime="2025-12-04 17:49:23.657713898 +0000 UTC m=+1375.018788350" watchObservedRunningTime="2025-12-04 17:49:23.664144976 +0000 UTC m=+1375.025219418" Dec 04 17:49:28 crc kubenswrapper[4948]: I1204 17:49:28.294083 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-qxx5m" Dec 04 17:49:40 crc kubenswrapper[4948]: I1204 17:49:40.625171 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:49:40 crc kubenswrapper[4948]: I1204 17:49:40.625728 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" 
podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:49:40 crc kubenswrapper[4948]: I1204 17:49:40.625777 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:49:40 crc kubenswrapper[4948]: I1204 17:49:40.626234 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e233a346ade9cd965a009c66f42b1cc18967a3cc196c7fca4634b21c2b68b2ec"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:49:40 crc kubenswrapper[4948]: I1204 17:49:40.626296 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://e233a346ade9cd965a009c66f42b1cc18967a3cc196c7fca4634b21c2b68b2ec" gracePeriod=600 Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.387484 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28"] Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.389109 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.390550 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.400485 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28"] Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.531698 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2t67\" (UniqueName: \"kubernetes.io/projected/8cba0165-dc0e-450d-b958-eb2d861e3b15-kube-api-access-z2t67\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.531767 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.531823 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: 
I1204 17:49:41.633307 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.633390 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2t67\" (UniqueName: \"kubernetes.io/projected/8cba0165-dc0e-450d-b958-eb2d861e3b15-kube-api-access-z2t67\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.633442 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.634061 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.634140 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.664591 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2t67\" (UniqueName: \"kubernetes.io/projected/8cba0165-dc0e-450d-b958-eb2d861e3b15-kube-api-access-z2t67\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.708539 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.760839 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="e233a346ade9cd965a009c66f42b1cc18967a3cc196c7fca4634b21c2b68b2ec" exitCode=0 Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.760865 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"e233a346ade9cd965a009c66f42b1cc18967a3cc196c7fca4634b21c2b68b2ec"} Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.760914 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5"} Dec 04 17:49:41 crc kubenswrapper[4948]: I1204 17:49:41.760936 4948 
scope.go:117] "RemoveContainer" containerID="889a470646dcbfc4695d99b33c11f25b714a5667a661661c5be2000a77114372" Dec 04 17:49:42 crc kubenswrapper[4948]: I1204 17:49:42.141915 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28"] Dec 04 17:49:42 crc kubenswrapper[4948]: I1204 17:49:42.775229 4948 generic.go:334] "Generic (PLEG): container finished" podID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerID="82d69f82ca96ceade847be1563ac49a302bd35cec28e4c30f535ac353006eac3" exitCode=0 Dec 04 17:49:42 crc kubenswrapper[4948]: I1204 17:49:42.777255 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" event={"ID":"8cba0165-dc0e-450d-b958-eb2d861e3b15","Type":"ContainerDied","Data":"82d69f82ca96ceade847be1563ac49a302bd35cec28e4c30f535ac353006eac3"} Dec 04 17:49:42 crc kubenswrapper[4948]: I1204 17:49:42.777334 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" event={"ID":"8cba0165-dc0e-450d-b958-eb2d861e3b15","Type":"ContainerStarted","Data":"e4d0c318bbba55f7c9cac0bae466e598237e7579166783eb68aa4b4427280170"} Dec 04 17:49:43 crc kubenswrapper[4948]: I1204 17:49:43.705213 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-m5k2z" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerName="console" containerID="cri-o://7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d" gracePeriod=15 Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.415313 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-m5k2z_f80f2233-6a99-49c2-a8fc-1bb335b2dd79/console/0.log" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.415394 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.576526 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-oauth-config\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.576921 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-config\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.576963 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-oauth-serving-cert\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.577003 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-service-ca\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.577085 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-trusted-ca-bundle\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.577130 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dtp4d\" (UniqueName: \"kubernetes.io/projected/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-kube-api-access-dtp4d\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.577160 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-serving-cert\") pod \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\" (UID: \"f80f2233-6a99-49c2-a8fc-1bb335b2dd79\") " Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.579959 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.580575 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-service-ca" (OuterVolumeSpecName: "service-ca") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.580764 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.581848 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-config" (OuterVolumeSpecName: "console-config") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.584913 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.586225 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-kube-api-access-dtp4d" (OuterVolumeSpecName: "kube-api-access-dtp4d") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "kube-api-access-dtp4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.586287 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f80f2233-6a99-49c2-a8fc-1bb335b2dd79" (UID: "f80f2233-6a99-49c2-a8fc-1bb335b2dd79"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679408 4948 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679469 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtp4d\" (UniqueName: \"kubernetes.io/projected/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-kube-api-access-dtp4d\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679491 4948 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679511 4948 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679532 4948 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679549 4948 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.679567 4948 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80f2233-6a99-49c2-a8fc-1bb335b2dd79-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:44 crc 
kubenswrapper[4948]: I1204 17:49:44.792793 4948 generic.go:334] "Generic (PLEG): container finished" podID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerID="6b008e25079ddbad1a0ff97684a9c22815a091a941f67b651870241f56c5c484" exitCode=0 Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.792857 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" event={"ID":"8cba0165-dc0e-450d-b958-eb2d861e3b15","Type":"ContainerDied","Data":"6b008e25079ddbad1a0ff97684a9c22815a091a941f67b651870241f56c5c484"} Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.794282 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-m5k2z_f80f2233-6a99-49c2-a8fc-1bb335b2dd79/console/0.log" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.794322 4948 generic.go:334] "Generic (PLEG): container finished" podID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerID="7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d" exitCode=2 Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.794340 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m5k2z" event={"ID":"f80f2233-6a99-49c2-a8fc-1bb335b2dd79","Type":"ContainerDied","Data":"7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d"} Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.794364 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-m5k2z" event={"ID":"f80f2233-6a99-49c2-a8fc-1bb335b2dd79","Type":"ContainerDied","Data":"08362ca2cf4b27503604d585ec31d395f8785554434340773d17fc0ad06d0a8f"} Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.794379 4948 scope.go:117] "RemoveContainer" containerID="7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.794423 4948 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-m5k2z" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.823622 4948 scope.go:117] "RemoveContainer" containerID="7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d" Dec 04 17:49:44 crc kubenswrapper[4948]: E1204 17:49:44.824303 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d\": container with ID starting with 7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d not found: ID does not exist" containerID="7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.824349 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d"} err="failed to get container status \"7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d\": rpc error: code = NotFound desc = could not find container \"7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d\": container with ID starting with 7f89effc1d08be999b5943e4d6d113986698a16ee72e9582665235b86334335d not found: ID does not exist" Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.849793 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-m5k2z"] Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.853810 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-m5k2z"] Dec 04 17:49:44 crc kubenswrapper[4948]: I1204 17:49:44.922259 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" path="/var/lib/kubelet/pods/f80f2233-6a99-49c2-a8fc-1bb335b2dd79/volumes" Dec 04 17:49:45 crc kubenswrapper[4948]: I1204 17:49:45.825171 4948 generic.go:334] "Generic 
(PLEG): container finished" podID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerID="982b60771e3f160915b68feb390b8300c687762295e096f70061e30e611afeec" exitCode=0 Dec 04 17:49:45 crc kubenswrapper[4948]: I1204 17:49:45.825472 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" event={"ID":"8cba0165-dc0e-450d-b958-eb2d861e3b15","Type":"ContainerDied","Data":"982b60771e3f160915b68feb390b8300c687762295e096f70061e30e611afeec"} Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.083154 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.221107 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-util\") pod \"8cba0165-dc0e-450d-b958-eb2d861e3b15\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.221154 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-bundle\") pod \"8cba0165-dc0e-450d-b958-eb2d861e3b15\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.221181 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2t67\" (UniqueName: \"kubernetes.io/projected/8cba0165-dc0e-450d-b958-eb2d861e3b15-kube-api-access-z2t67\") pod \"8cba0165-dc0e-450d-b958-eb2d861e3b15\" (UID: \"8cba0165-dc0e-450d-b958-eb2d861e3b15\") " Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.222808 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-bundle" (OuterVolumeSpecName: "bundle") pod "8cba0165-dc0e-450d-b958-eb2d861e3b15" (UID: "8cba0165-dc0e-450d-b958-eb2d861e3b15"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.232349 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cba0165-dc0e-450d-b958-eb2d861e3b15-kube-api-access-z2t67" (OuterVolumeSpecName: "kube-api-access-z2t67") pod "8cba0165-dc0e-450d-b958-eb2d861e3b15" (UID: "8cba0165-dc0e-450d-b958-eb2d861e3b15"). InnerVolumeSpecName "kube-api-access-z2t67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.237459 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-util" (OuterVolumeSpecName: "util") pod "8cba0165-dc0e-450d-b958-eb2d861e3b15" (UID: "8cba0165-dc0e-450d-b958-eb2d861e3b15"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.322577 4948 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-util\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.322610 4948 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8cba0165-dc0e-450d-b958-eb2d861e3b15-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.322624 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2t67\" (UniqueName: \"kubernetes.io/projected/8cba0165-dc0e-450d-b958-eb2d861e3b15-kube-api-access-z2t67\") on node \"crc\" DevicePath \"\"" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.842769 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" event={"ID":"8cba0165-dc0e-450d-b958-eb2d861e3b15","Type":"ContainerDied","Data":"e4d0c318bbba55f7c9cac0bae466e598237e7579166783eb68aa4b4427280170"} Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.842813 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d0c318bbba55f7c9cac0bae466e598237e7579166783eb68aa4b4427280170" Dec 04 17:49:47 crc kubenswrapper[4948]: I1204 17:49:47.842838 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743225 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q"] Dec 04 17:49:56 crc kubenswrapper[4948]: E1204 17:49:56.743818 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerName="extract" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743829 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerName="extract" Dec 04 17:49:56 crc kubenswrapper[4948]: E1204 17:49:56.743838 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerName="pull" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743844 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerName="pull" Dec 04 17:49:56 crc kubenswrapper[4948]: E1204 17:49:56.743856 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerName="util" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743861 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" containerName="util" Dec 04 17:49:56 crc kubenswrapper[4948]: E1204 17:49:56.743879 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerName="console" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743884 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerName="console" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743981 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cba0165-dc0e-450d-b958-eb2d861e3b15" 
containerName="extract" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.743990 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80f2233-6a99-49c2-a8fc-1bb335b2dd79" containerName="console" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.744390 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.746509 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.746874 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.747005 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lr7dc" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.747480 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.747974 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.766157 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q"] Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.942675 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7372c6f5-6966-4cdf-a798-514c25eb08c3-webhook-cert\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 
17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.942751 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7372c6f5-6966-4cdf-a798-514c25eb08c3-apiservice-cert\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.942846 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mlkk\" (UniqueName: \"kubernetes.io/projected/7372c6f5-6966-4cdf-a798-514c25eb08c3-kube-api-access-8mlkk\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.973077 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs"] Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.973908 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.976278 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9kdbz" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.976339 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.977443 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 04 17:49:56 crc kubenswrapper[4948]: I1204 17:49:56.994902 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs"] Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.043661 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mlkk\" (UniqueName: \"kubernetes.io/projected/7372c6f5-6966-4cdf-a798-514c25eb08c3-kube-api-access-8mlkk\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.043761 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7372c6f5-6966-4cdf-a798-514c25eb08c3-webhook-cert\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.043798 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7372c6f5-6966-4cdf-a798-514c25eb08c3-apiservice-cert\") pod 
\"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.051741 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7372c6f5-6966-4cdf-a798-514c25eb08c3-apiservice-cert\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.051838 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7372c6f5-6966-4cdf-a798-514c25eb08c3-webhook-cert\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.066840 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mlkk\" (UniqueName: \"kubernetes.io/projected/7372c6f5-6966-4cdf-a798-514c25eb08c3-kube-api-access-8mlkk\") pod \"metallb-operator-controller-manager-769d6bffcb-ptk7q\" (UID: \"7372c6f5-6966-4cdf-a798-514c25eb08c3\") " pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.144562 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h925p\" (UniqueName: \"kubernetes.io/projected/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-kube-api-access-h925p\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: 
I1204 17:49:57.144624 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-apiservice-cert\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.144650 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-webhook-cert\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.246028 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h925p\" (UniqueName: \"kubernetes.io/projected/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-kube-api-access-h925p\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.246103 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-apiservice-cert\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.246126 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-webhook-cert\") pod 
\"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.256887 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-webhook-cert\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.256970 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-apiservice-cert\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.263491 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h925p\" (UniqueName: \"kubernetes.io/projected/9ec6eeb9-ca17-49dc-bec3-c956b8c63c60-kube-api-access-h925p\") pod \"metallb-operator-webhook-server-54cdb97cf7-jfhjs\" (UID: \"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60\") " pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.286294 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.358418 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.779420 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs"] Dec 04 17:49:57 crc kubenswrapper[4948]: W1204 17:49:57.791558 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ec6eeb9_ca17_49dc_bec3_c956b8c63c60.slice/crio-6187ca1cf127381f66422f87cb0b6d524425c433eca82f5e0167eb36e3307cae WatchSource:0}: Error finding container 6187ca1cf127381f66422f87cb0b6d524425c433eca82f5e0167eb36e3307cae: Status 404 returned error can't find the container with id 6187ca1cf127381f66422f87cb0b6d524425c433eca82f5e0167eb36e3307cae Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.896210 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" event={"ID":"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60","Type":"ContainerStarted","Data":"6187ca1cf127381f66422f87cb0b6d524425c433eca82f5e0167eb36e3307cae"} Dec 04 17:49:57 crc kubenswrapper[4948]: I1204 17:49:57.908815 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q"] Dec 04 17:49:57 crc kubenswrapper[4948]: W1204 17:49:57.912523 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7372c6f5_6966_4cdf_a798_514c25eb08c3.slice/crio-4817a757a8f6e5f5f27010a54a281d2baa7a82c9c301534601deba8c11edc866 WatchSource:0}: Error finding container 4817a757a8f6e5f5f27010a54a281d2baa7a82c9c301534601deba8c11edc866: Status 404 returned error can't find the container with id 4817a757a8f6e5f5f27010a54a281d2baa7a82c9c301534601deba8c11edc866 Dec 04 17:49:58 crc kubenswrapper[4948]: I1204 17:49:58.902503 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" event={"ID":"7372c6f5-6966-4cdf-a798-514c25eb08c3","Type":"ContainerStarted","Data":"4817a757a8f6e5f5f27010a54a281d2baa7a82c9c301534601deba8c11edc866"} Dec 04 17:50:02 crc kubenswrapper[4948]: I1204 17:50:02.935899 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" event={"ID":"7372c6f5-6966-4cdf-a798-514c25eb08c3","Type":"ContainerStarted","Data":"779b21579f7fa82e78becb14b21099991b905aaf8cf3554bc173d9a083986e92"} Dec 04 17:50:02 crc kubenswrapper[4948]: I1204 17:50:02.936376 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:50:02 crc kubenswrapper[4948]: I1204 17:50:02.937294 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" event={"ID":"9ec6eeb9-ca17-49dc-bec3-c956b8c63c60","Type":"ContainerStarted","Data":"3bd3496dd5874c2952744db297da16b2d1d610ac00ca9f2aac2f3c0da1a3d01d"} Dec 04 17:50:02 crc kubenswrapper[4948]: I1204 17:50:02.938021 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:50:02 crc kubenswrapper[4948]: I1204 17:50:02.959871 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" podStartSLOduration=2.274170972 podStartE2EDuration="6.959842998s" podCreationTimestamp="2025-12-04 17:49:56 +0000 UTC" firstStartedPulling="2025-12-04 17:49:57.915645313 +0000 UTC m=+1409.276719725" lastFinishedPulling="2025-12-04 17:50:02.601317349 +0000 UTC m=+1413.962391751" observedRunningTime="2025-12-04 17:50:02.952443824 +0000 UTC m=+1414.313518236" watchObservedRunningTime="2025-12-04 17:50:02.959842998 +0000 UTC m=+1414.320917440" Dec 
04 17:50:02 crc kubenswrapper[4948]: I1204 17:50:02.981252 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" podStartSLOduration=2.144802126 podStartE2EDuration="6.981230879s" podCreationTimestamp="2025-12-04 17:49:56 +0000 UTC" firstStartedPulling="2025-12-04 17:49:57.794886935 +0000 UTC m=+1409.155961337" lastFinishedPulling="2025-12-04 17:50:02.631315688 +0000 UTC m=+1413.992390090" observedRunningTime="2025-12-04 17:50:02.977863136 +0000 UTC m=+1414.338937588" watchObservedRunningTime="2025-12-04 17:50:02.981230879 +0000 UTC m=+1414.342305291" Dec 04 17:50:17 crc kubenswrapper[4948]: I1204 17:50:17.292496 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54cdb97cf7-jfhjs" Dec 04 17:50:37 crc kubenswrapper[4948]: I1204 17:50:37.360901 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-769d6bffcb-ptk7q" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.049721 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.050688 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.054312 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.054509 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-blwqt" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.054874 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7v8rn"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.058281 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.060933 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.062669 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089311 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-reloader\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089359 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76tvt\" (UniqueName: \"kubernetes.io/projected/0ca06785-e8b8-43ba-919d-2a00e88b9092-kube-api-access-76tvt\") pod \"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089391 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089432 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-conf\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089690 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-sockets\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089776 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcl49\" (UniqueName: \"kubernetes.io/projected/0b72e58a-51a9-49bc-b31e-ce04b0daf651-kube-api-access-dcl49\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089828 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ca06785-e8b8-43ba-919d-2a00e88b9092-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089863 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-startup\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.089925 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics-certs\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.103079 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.148737 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-9jvll"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.149544 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.151530 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vfhgm" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.151663 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.152072 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.155645 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.164806 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-4mcsv"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.165727 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.168667 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.187042 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-4mcsv"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190793 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-sockets\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190839 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80cdeb47-3878-4100-bf8b-bed7e8df3c74-cert\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190862 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcl49\" (UniqueName: \"kubernetes.io/projected/0b72e58a-51a9-49bc-b31e-ce04b0daf651-kube-api-access-dcl49\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190880 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190901 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ca06785-e8b8-43ba-919d-2a00e88b9092-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190919 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-metrics-certs\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190934 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-startup\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190959 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics-certs\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190980 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pnqk\" (UniqueName: \"kubernetes.io/projected/6412088a-587d-4cf6-b85a-087535dc9378-kube-api-access-5pnqk\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.190998 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/80cdeb47-3878-4100-bf8b-bed7e8df3c74-metrics-certs\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191015 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-reloader\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191082 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rww\" (UniqueName: \"kubernetes.io/projected/80cdeb47-3878-4100-bf8b-bed7e8df3c74-kube-api-access-b5rww\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.191104 4948 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.191186 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ca06785-e8b8-43ba-919d-2a00e88b9092-cert podName:0ca06785-e8b8-43ba-919d-2a00e88b9092 nodeName:}" failed. No retries permitted until 2025-12-04 17:50:38.691162487 +0000 UTC m=+1450.052237089 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ca06785-e8b8-43ba-919d-2a00e88b9092-cert") pod "frr-k8s-webhook-server-7fcb986d4-tmtnb" (UID: "0ca06785-e8b8-43ba-919d-2a00e88b9092") : secret "frr-k8s-webhook-server-cert" not found Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.191294 4948 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191309 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76tvt\" (UniqueName: \"kubernetes.io/projected/0ca06785-e8b8-43ba-919d-2a00e88b9092-kube-api-access-76tvt\") pod \"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.191407 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics-certs podName:0b72e58a-51a9-49bc-b31e-ce04b0daf651 nodeName:}" failed. No retries permitted until 2025-12-04 17:50:38.691378133 +0000 UTC m=+1450.052452535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics-certs") pod "frr-k8s-7v8rn" (UID: "0b72e58a-51a9-49bc-b31e-ce04b0daf651") : secret "frr-k8s-certs-secret" not found Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191464 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191506 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-conf\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191598 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6412088a-587d-4cf6-b85a-087535dc9378-metallb-excludel2\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191650 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-reloader\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.191830 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " 
pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.192140 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-conf\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.192240 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-startup\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.192604 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0b72e58a-51a9-49bc-b31e-ce04b0daf651-frr-sockets\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.248899 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcl49\" (UniqueName: \"kubernetes.io/projected/0b72e58a-51a9-49bc-b31e-ce04b0daf651-kube-api-access-dcl49\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.249418 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76tvt\" (UniqueName: \"kubernetes.io/projected/0ca06785-e8b8-43ba-919d-2a00e88b9092-kube-api-access-76tvt\") pod \"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.292931 4948 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-b5rww\" (UniqueName: \"kubernetes.io/projected/80cdeb47-3878-4100-bf8b-bed7e8df3c74-kube-api-access-b5rww\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.293019 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6412088a-587d-4cf6-b85a-087535dc9378-metallb-excludel2\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.293058 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80cdeb47-3878-4100-bf8b-bed7e8df3c74-cert\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.293078 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.293112 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-metrics-certs\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.293151 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pnqk\" (UniqueName: \"kubernetes.io/projected/6412088a-587d-4cf6-b85a-087535dc9378-kube-api-access-5pnqk\") 
pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.293167 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80cdeb47-3878-4100-bf8b-bed7e8df3c74-metrics-certs\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.293437 4948 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.293563 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist podName:6412088a-587d-4cf6-b85a-087535dc9378 nodeName:}" failed. No retries permitted until 2025-12-04 17:50:38.793546777 +0000 UTC m=+1450.154621179 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist") pod "speaker-9jvll" (UID: "6412088a-587d-4cf6-b85a-087535dc9378") : secret "metallb-memberlist" not found Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.293807 4948 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.293898 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-metrics-certs podName:6412088a-587d-4cf6-b85a-087535dc9378 nodeName:}" failed. No retries permitted until 2025-12-04 17:50:38.793869695 +0000 UTC m=+1450.154944097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-metrics-certs") pod "speaker-9jvll" (UID: "6412088a-587d-4cf6-b85a-087535dc9378") : secret "speaker-certs-secret" not found Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.294322 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6412088a-587d-4cf6-b85a-087535dc9378-metallb-excludel2\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.295531 4948 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.297548 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80cdeb47-3878-4100-bf8b-bed7e8df3c74-metrics-certs\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.308416 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80cdeb47-3878-4100-bf8b-bed7e8df3c74-cert\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.311665 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pnqk\" (UniqueName: \"kubernetes.io/projected/6412088a-587d-4cf6-b85a-087535dc9378-kube-api-access-5pnqk\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.316642 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b5rww\" (UniqueName: \"kubernetes.io/projected/80cdeb47-3878-4100-bf8b-bed7e8df3c74-kube-api-access-b5rww\") pod \"controller-f8648f98b-4mcsv\" (UID: \"80cdeb47-3878-4100-bf8b-bed7e8df3c74\") " pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.477595 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.692683 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-4mcsv"] Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.697869 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ca06785-e8b8-43ba-919d-2a00e88b9092-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.697948 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics-certs\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.704190 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b72e58a-51a9-49bc-b31e-ce04b0daf651-metrics-certs\") pod \"frr-k8s-7v8rn\" (UID: \"0b72e58a-51a9-49bc-b31e-ce04b0daf651\") " pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.704687 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ca06785-e8b8-43ba-919d-2a00e88b9092-cert\") pod 
\"frr-k8s-webhook-server-7fcb986d4-tmtnb\" (UID: \"0ca06785-e8b8-43ba-919d-2a00e88b9092\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.798872 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.799272 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-metrics-certs\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.799200 4948 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 17:50:38 crc kubenswrapper[4948]: E1204 17:50:38.799596 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist podName:6412088a-587d-4cf6-b85a-087535dc9378 nodeName:}" failed. No retries permitted until 2025-12-04 17:50:39.799572833 +0000 UTC m=+1451.160647235 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist") pod "speaker-9jvll" (UID: "6412088a-587d-4cf6-b85a-087535dc9378") : secret "metallb-memberlist" not found Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.803865 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-metrics-certs\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.967376 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:38 crc kubenswrapper[4948]: I1204 17:50:38.979380 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.149533 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"5471cf242147c1f7606e69584ab3308db9849becdea539d8febdd21acccf8906"} Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.151355 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-4mcsv" event={"ID":"80cdeb47-3878-4100-bf8b-bed7e8df3c74","Type":"ContainerStarted","Data":"2b9797c61b84d0cee65de41da39b0ee49fb008140bd44b53681b05d6398dab79"} Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.151380 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-4mcsv" event={"ID":"80cdeb47-3878-4100-bf8b-bed7e8df3c74","Type":"ContainerStarted","Data":"81de64d752f8ac29f789c9f66c05c884adfa1d202c8449e68643d8ff82c4115e"} Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.151391 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-4mcsv" event={"ID":"80cdeb47-3878-4100-bf8b-bed7e8df3c74","Type":"ContainerStarted","Data":"72aecb69540231e3c37612ab8b28897168814b07adc998fc3ce1eaa2b42270c7"} Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.151529 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.177539 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-4mcsv" podStartSLOduration=1.177519779 podStartE2EDuration="1.177519779s" podCreationTimestamp="2025-12-04 17:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:50:39.168660874 +0000 UTC m=+1450.529735286" watchObservedRunningTime="2025-12-04 17:50:39.177519779 +0000 UTC m=+1450.538594181" Dec 04 17:50:39 crc kubenswrapper[4948]: W1204 17:50:39.401001 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ca06785_e8b8_43ba_919d_2a00e88b9092.slice/crio-61a639f8b6fc4fff1d982ceee6bf19d15d210c71cbc66c2162a0fd58c6231333 WatchSource:0}: Error finding container 61a639f8b6fc4fff1d982ceee6bf19d15d210c71cbc66c2162a0fd58c6231333: Status 404 returned error can't find the container with id 61a639f8b6fc4fff1d982ceee6bf19d15d210c71cbc66c2162a0fd58c6231333 Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.402988 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb"] Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.814060 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist\") pod \"speaker-9jvll\" (UID: 
\"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.825858 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6412088a-587d-4cf6-b85a-087535dc9378-memberlist\") pod \"speaker-9jvll\" (UID: \"6412088a-587d-4cf6-b85a-087535dc9378\") " pod="metallb-system/speaker-9jvll" Dec 04 17:50:39 crc kubenswrapper[4948]: I1204 17:50:39.965246 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-9jvll" Dec 04 17:50:39 crc kubenswrapper[4948]: W1204 17:50:39.988518 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6412088a_587d_4cf6_b85a_087535dc9378.slice/crio-eaafbbdc3e125c3aac6822aac3ce7aa7cb929c0c114e1af1b37d47298f4f0a7b WatchSource:0}: Error finding container eaafbbdc3e125c3aac6822aac3ce7aa7cb929c0c114e1af1b37d47298f4f0a7b: Status 404 returned error can't find the container with id eaafbbdc3e125c3aac6822aac3ce7aa7cb929c0c114e1af1b37d47298f4f0a7b Dec 04 17:50:40 crc kubenswrapper[4948]: I1204 17:50:40.158745 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9jvll" event={"ID":"6412088a-587d-4cf6-b85a-087535dc9378","Type":"ContainerStarted","Data":"eaafbbdc3e125c3aac6822aac3ce7aa7cb929c0c114e1af1b37d47298f4f0a7b"} Dec 04 17:50:40 crc kubenswrapper[4948]: I1204 17:50:40.159757 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" event={"ID":"0ca06785-e8b8-43ba-919d-2a00e88b9092","Type":"ContainerStarted","Data":"61a639f8b6fc4fff1d982ceee6bf19d15d210c71cbc66c2162a0fd58c6231333"} Dec 04 17:50:41 crc kubenswrapper[4948]: I1204 17:50:41.167089 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9jvll" 
event={"ID":"6412088a-587d-4cf6-b85a-087535dc9378","Type":"ContainerStarted","Data":"21ab63f9727b7354efd748daddc3bba8460c731398a543eb6bdaf11bdd753817"} Dec 04 17:50:41 crc kubenswrapper[4948]: I1204 17:50:41.167361 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-9jvll" event={"ID":"6412088a-587d-4cf6-b85a-087535dc9378","Type":"ContainerStarted","Data":"25decc8195969858805a2eceedc7a6b205ffc193743731c976789fff06a9edb2"} Dec 04 17:50:42 crc kubenswrapper[4948]: I1204 17:50:42.172842 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-9jvll" Dec 04 17:50:42 crc kubenswrapper[4948]: I1204 17:50:42.216016 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-9jvll" podStartSLOduration=4.215993098 podStartE2EDuration="4.215993098s" podCreationTimestamp="2025-12-04 17:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:50:42.211236777 +0000 UTC m=+1453.572311179" watchObservedRunningTime="2025-12-04 17:50:42.215993098 +0000 UTC m=+1453.577067500" Dec 04 17:50:47 crc kubenswrapper[4948]: I1204 17:50:47.202088 4948 generic.go:334] "Generic (PLEG): container finished" podID="0b72e58a-51a9-49bc-b31e-ce04b0daf651" containerID="c7f8acaf92cc1c75fb8da171a06747f6f10ae7fb802d6dd29ae685611318317f" exitCode=0 Dec 04 17:50:47 crc kubenswrapper[4948]: I1204 17:50:47.202353 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerDied","Data":"c7f8acaf92cc1c75fb8da171a06747f6f10ae7fb802d6dd29ae685611318317f"} Dec 04 17:50:48 crc kubenswrapper[4948]: I1204 17:50:48.209507 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" 
event={"ID":"0ca06785-e8b8-43ba-919d-2a00e88b9092","Type":"ContainerStarted","Data":"f5d26517a73da5fc07051f577d6ae0b20b98d2f2697847601c7c11d076950d8b"} Dec 04 17:50:48 crc kubenswrapper[4948]: I1204 17:50:48.209881 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:48 crc kubenswrapper[4948]: I1204 17:50:48.211831 4948 generic.go:334] "Generic (PLEG): container finished" podID="0b72e58a-51a9-49bc-b31e-ce04b0daf651" containerID="3e04af9d7e2e21701d66a2af10b1214f50a3408146f3037f463a221cae08e8a4" exitCode=0 Dec 04 17:50:48 crc kubenswrapper[4948]: I1204 17:50:48.211861 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerDied","Data":"3e04af9d7e2e21701d66a2af10b1214f50a3408146f3037f463a221cae08e8a4"} Dec 04 17:50:48 crc kubenswrapper[4948]: I1204 17:50:48.230411 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" podStartSLOduration=2.152887237 podStartE2EDuration="10.230389438s" podCreationTimestamp="2025-12-04 17:50:38 +0000 UTC" firstStartedPulling="2025-12-04 17:50:39.404810311 +0000 UTC m=+1450.765890713" lastFinishedPulling="2025-12-04 17:50:47.482318512 +0000 UTC m=+1458.843392914" observedRunningTime="2025-12-04 17:50:48.228528046 +0000 UTC m=+1459.589602458" watchObservedRunningTime="2025-12-04 17:50:48.230389438 +0000 UTC m=+1459.591463840" Dec 04 17:50:48 crc kubenswrapper[4948]: I1204 17:50:48.482526 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-4mcsv" Dec 04 17:50:49 crc kubenswrapper[4948]: I1204 17:50:49.219425 4948 generic.go:334] "Generic (PLEG): container finished" podID="0b72e58a-51a9-49bc-b31e-ce04b0daf651" containerID="1633ca86428ad2abb48cbb830155d4b1355b3be0c9a9815c6e644cb5bc3931fc" exitCode=0 Dec 04 
17:50:49 crc kubenswrapper[4948]: I1204 17:50:49.220740 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerDied","Data":"1633ca86428ad2abb48cbb830155d4b1355b3be0c9a9815c6e644cb5bc3931fc"} Dec 04 17:50:50 crc kubenswrapper[4948]: I1204 17:50:50.231737 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"610d142c82aa9f6c29385549c503221bb7d31c60c735cb6e6864dcdc7ffbf2ee"} Dec 04 17:50:50 crc kubenswrapper[4948]: I1204 17:50:50.232034 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"1f42520bc74c523e13c5dfec6318882b993acc33ce40e677004e277da0f89768"} Dec 04 17:50:50 crc kubenswrapper[4948]: I1204 17:50:50.232075 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"fc5af8cdcfcf3e13fbc6c0084219cd37f589cec7fbfeef5edb1c112553390591"} Dec 04 17:50:50 crc kubenswrapper[4948]: I1204 17:50:50.232085 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"d4f611139d17e79c18faa0a64a1ef1559b79a0318dd0cc7d8ceb89a52b9bdc01"} Dec 04 17:50:50 crc kubenswrapper[4948]: I1204 17:50:50.232093 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"3de4ff13fcef2bdb16f2840aedb9adc4d74cdb4ebad37e55dfee2864b2a31e1d"} Dec 04 17:50:51 crc kubenswrapper[4948]: I1204 17:50:51.242662 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7v8rn" 
event={"ID":"0b72e58a-51a9-49bc-b31e-ce04b0daf651","Type":"ContainerStarted","Data":"dfda43a940cc8237f77005b70d3f921e4d96a60bc414089c6aef667e18c97845"} Dec 04 17:50:51 crc kubenswrapper[4948]: I1204 17:50:51.242839 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:51 crc kubenswrapper[4948]: I1204 17:50:51.275003 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7v8rn" podStartSLOduration=5.873350824 podStartE2EDuration="13.274985246s" podCreationTimestamp="2025-12-04 17:50:38 +0000 UTC" firstStartedPulling="2025-12-04 17:50:39.106974989 +0000 UTC m=+1450.468049391" lastFinishedPulling="2025-12-04 17:50:46.508609411 +0000 UTC m=+1457.869683813" observedRunningTime="2025-12-04 17:50:51.268590979 +0000 UTC m=+1462.629665381" watchObservedRunningTime="2025-12-04 17:50:51.274985246 +0000 UTC m=+1462.636059648" Dec 04 17:50:53 crc kubenswrapper[4948]: I1204 17:50:53.980251 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:54 crc kubenswrapper[4948]: I1204 17:50:54.047074 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:50:58 crc kubenswrapper[4948]: I1204 17:50:58.973455 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-tmtnb" Dec 04 17:50:59 crc kubenswrapper[4948]: I1204 17:50:59.969321 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-9jvll" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.336199 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq"] Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.337890 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.340319 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.340975 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq"] Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.450109 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwcbb\" (UniqueName: \"kubernetes.io/projected/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-kube-api-access-vwcbb\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.450172 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.450221 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: 
I1204 17:51:01.552007 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwcbb\" (UniqueName: \"kubernetes.io/projected/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-kube-api-access-vwcbb\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.552114 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.552159 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.552679 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.552747 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.577999 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwcbb\" (UniqueName: \"kubernetes.io/projected/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-kube-api-access-vwcbb\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:01 crc kubenswrapper[4948]: I1204 17:51:01.659977 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:02 crc kubenswrapper[4948]: I1204 17:51:02.064639 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq"] Dec 04 17:51:02 crc kubenswrapper[4948]: I1204 17:51:02.317148 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" event={"ID":"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd","Type":"ContainerStarted","Data":"608aa17333a84cee87b02e393c78563cf837e1265c545a7d60960d6ffeb574da"} Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.674726 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p5g7s"] Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.677012 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.691235 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5g7s"] Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.785669 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-catalog-content\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.785997 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcrtv\" (UniqueName: \"kubernetes.io/projected/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-kube-api-access-tcrtv\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.786188 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-utilities\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.887792 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcrtv\" (UniqueName: \"kubernetes.io/projected/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-kube-api-access-tcrtv\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.888130 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-utilities\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.888309 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-catalog-content\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.889189 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-utilities\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.889249 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-catalog-content\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.912563 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcrtv\" (UniqueName: \"kubernetes.io/projected/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-kube-api-access-tcrtv\") pod \"redhat-operators-p5g7s\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:03 crc kubenswrapper[4948]: I1204 17:51:03.997294 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:04 crc kubenswrapper[4948]: I1204 17:51:04.231282 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5g7s"] Dec 04 17:51:04 crc kubenswrapper[4948]: I1204 17:51:04.332632 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5g7s" event={"ID":"34ed8172-fdcb-4d1f-9d9d-139b67592bdc","Type":"ContainerStarted","Data":"ff5bcf0d550cefbdf0a66d7cc2843b85d93ff68a23b93601cbd53756775b166c"} Dec 04 17:51:04 crc kubenswrapper[4948]: I1204 17:51:04.334687 4948 generic.go:334] "Generic (PLEG): container finished" podID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerID="1ebcc7ea97ea5a62004c620d424b72175d025ed23befc4e3a6ee9fd7da204da7" exitCode=0 Dec 04 17:51:04 crc kubenswrapper[4948]: I1204 17:51:04.334723 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" event={"ID":"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd","Type":"ContainerDied","Data":"1ebcc7ea97ea5a62004c620d424b72175d025ed23befc4e3a6ee9fd7da204da7"} Dec 04 17:51:05 crc kubenswrapper[4948]: I1204 17:51:05.348573 4948 generic.go:334] "Generic (PLEG): container finished" podID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerID="8e710139c6cd9dc85b5cf07fc0ad78589482e5bc90405bc4a37d6f28e1142f95" exitCode=0 Dec 04 17:51:05 crc kubenswrapper[4948]: I1204 17:51:05.348623 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5g7s" event={"ID":"34ed8172-fdcb-4d1f-9d9d-139b67592bdc","Type":"ContainerDied","Data":"8e710139c6cd9dc85b5cf07fc0ad78589482e5bc90405bc4a37d6f28e1142f95"} Dec 04 17:51:08 crc kubenswrapper[4948]: I1204 17:51:08.368249 4948 generic.go:334] "Generic (PLEG): container finished" podID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" 
containerID="f51bc04c2961ee7185ab09acbcc408f1ba2580e81248e9464f5bcc5f7d183e5e" exitCode=0 Dec 04 17:51:08 crc kubenswrapper[4948]: I1204 17:51:08.368379 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5g7s" event={"ID":"34ed8172-fdcb-4d1f-9d9d-139b67592bdc","Type":"ContainerDied","Data":"f51bc04c2961ee7185ab09acbcc408f1ba2580e81248e9464f5bcc5f7d183e5e"} Dec 04 17:51:08 crc kubenswrapper[4948]: I1204 17:51:08.373462 4948 generic.go:334] "Generic (PLEG): container finished" podID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerID="43af50465bf5f857db29685c6e09c7b1d777ff6b77b008647e3caa48062e5eeb" exitCode=0 Dec 04 17:51:08 crc kubenswrapper[4948]: I1204 17:51:08.373566 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" event={"ID":"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd","Type":"ContainerDied","Data":"43af50465bf5f857db29685c6e09c7b1d777ff6b77b008647e3caa48062e5eeb"} Dec 04 17:51:08 crc kubenswrapper[4948]: I1204 17:51:08.982369 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7v8rn" Dec 04 17:51:09 crc kubenswrapper[4948]: I1204 17:51:09.393450 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5g7s" event={"ID":"34ed8172-fdcb-4d1f-9d9d-139b67592bdc","Type":"ContainerStarted","Data":"0ba493f1995dcea711d156c1bfce2da50b433b7b0ca078483fbbfc2b782e1117"} Dec 04 17:51:09 crc kubenswrapper[4948]: I1204 17:51:09.396389 4948 generic.go:334] "Generic (PLEG): container finished" podID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerID="2a453753d9486663c032c09848a0446c9b4ffcd37dcf940cba52b981a56fbd33" exitCode=0 Dec 04 17:51:09 crc kubenswrapper[4948]: I1204 17:51:09.396428 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" 
event={"ID":"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd","Type":"ContainerDied","Data":"2a453753d9486663c032c09848a0446c9b4ffcd37dcf940cba52b981a56fbd33"} Dec 04 17:51:09 crc kubenswrapper[4948]: I1204 17:51:09.414834 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p5g7s" podStartSLOduration=2.923407169 podStartE2EDuration="6.414817317s" podCreationTimestamp="2025-12-04 17:51:03 +0000 UTC" firstStartedPulling="2025-12-04 17:51:05.35071325 +0000 UTC m=+1476.711787652" lastFinishedPulling="2025-12-04 17:51:08.842123388 +0000 UTC m=+1480.203197800" observedRunningTime="2025-12-04 17:51:09.409471369 +0000 UTC m=+1480.770545771" watchObservedRunningTime="2025-12-04 17:51:09.414817317 +0000 UTC m=+1480.775891719" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.644968 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.674729 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-util\") pod \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.674825 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-bundle\") pod \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.674867 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwcbb\" (UniqueName: \"kubernetes.io/projected/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-kube-api-access-vwcbb\") pod 
\"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\" (UID: \"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd\") " Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.676019 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-bundle" (OuterVolumeSpecName: "bundle") pod "bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" (UID: "bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.682455 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-kube-api-access-vwcbb" (OuterVolumeSpecName: "kube-api-access-vwcbb") pod "bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" (UID: "bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd"). InnerVolumeSpecName "kube-api-access-vwcbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.686599 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-util" (OuterVolumeSpecName: "util") pod "bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" (UID: "bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.775877 4948 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-util\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.775923 4948 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:10 crc kubenswrapper[4948]: I1204 17:51:10.775936 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwcbb\" (UniqueName: \"kubernetes.io/projected/bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd-kube-api-access-vwcbb\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:11 crc kubenswrapper[4948]: I1204 17:51:11.409794 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" event={"ID":"bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd","Type":"ContainerDied","Data":"608aa17333a84cee87b02e393c78563cf837e1265c545a7d60960d6ffeb574da"} Dec 04 17:51:11 crc kubenswrapper[4948]: I1204 17:51:11.409825 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="608aa17333a84cee87b02e393c78563cf837e1265c545a7d60960d6ffeb574da" Dec 04 17:51:11 crc kubenswrapper[4948]: I1204 17:51:11.409851 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq" Dec 04 17:51:13 crc kubenswrapper[4948]: I1204 17:51:13.998406 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:13 crc kubenswrapper[4948]: I1204 17:51:13.998748 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:15 crc kubenswrapper[4948]: I1204 17:51:15.050096 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p5g7s" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="registry-server" probeResult="failure" output=< Dec 04 17:51:15 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 17:51:15 crc kubenswrapper[4948]: > Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.255186 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5"] Dec 04 17:51:19 crc kubenswrapper[4948]: E1204 17:51:19.255848 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="extract" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.255864 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="extract" Dec 04 17:51:19 crc kubenswrapper[4948]: E1204 17:51:19.255889 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="pull" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.255901 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="pull" Dec 04 17:51:19 crc kubenswrapper[4948]: E1204 17:51:19.255921 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="util" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.255930 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="util" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.256106 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd" containerName="extract" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.256653 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.259549 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.260203 4948 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-c75ln" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.261735 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.274519 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5"] Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.422348 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/befdfaa6-0042-45e8-aa69-d87a907d3e5b-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-wxwg5\" (UID: \"befdfaa6-0042-45e8-aa69-d87a907d3e5b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.422402 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r849g\" (UniqueName: \"kubernetes.io/projected/befdfaa6-0042-45e8-aa69-d87a907d3e5b-kube-api-access-r849g\") pod \"cert-manager-operator-controller-manager-64cf6dff88-wxwg5\" (UID: \"befdfaa6-0042-45e8-aa69-d87a907d3e5b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.523268 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/befdfaa6-0042-45e8-aa69-d87a907d3e5b-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-wxwg5\" (UID: \"befdfaa6-0042-45e8-aa69-d87a907d3e5b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.523326 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r849g\" (UniqueName: \"kubernetes.io/projected/befdfaa6-0042-45e8-aa69-d87a907d3e5b-kube-api-access-r849g\") pod \"cert-manager-operator-controller-manager-64cf6dff88-wxwg5\" (UID: \"befdfaa6-0042-45e8-aa69-d87a907d3e5b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.523767 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/befdfaa6-0042-45e8-aa69-d87a907d3e5b-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-wxwg5\" (UID: \"befdfaa6-0042-45e8-aa69-d87a907d3e5b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.560027 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r849g\" (UniqueName: \"kubernetes.io/projected/befdfaa6-0042-45e8-aa69-d87a907d3e5b-kube-api-access-r849g\") 
pod \"cert-manager-operator-controller-manager-64cf6dff88-wxwg5\" (UID: \"befdfaa6-0042-45e8-aa69-d87a907d3e5b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:19 crc kubenswrapper[4948]: I1204 17:51:19.576225 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" Dec 04 17:51:20 crc kubenswrapper[4948]: I1204 17:51:19.994733 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5"] Dec 04 17:51:20 crc kubenswrapper[4948]: W1204 17:51:19.999088 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefdfaa6_0042_45e8_aa69_d87a907d3e5b.slice/crio-736f497399eb911b897addc7a5adc281cb7637daa7bae6bf90b8115b6c809d10 WatchSource:0}: Error finding container 736f497399eb911b897addc7a5adc281cb7637daa7bae6bf90b8115b6c809d10: Status 404 returned error can't find the container with id 736f497399eb911b897addc7a5adc281cb7637daa7bae6bf90b8115b6c809d10 Dec 04 17:51:20 crc kubenswrapper[4948]: I1204 17:51:20.469123 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" event={"ID":"befdfaa6-0042-45e8-aa69-d87a907d3e5b","Type":"ContainerStarted","Data":"736f497399eb911b897addc7a5adc281cb7637daa7bae6bf90b8115b6c809d10"} Dec 04 17:51:24 crc kubenswrapper[4948]: I1204 17:51:24.042629 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:24 crc kubenswrapper[4948]: I1204 17:51:24.089290 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:26 crc kubenswrapper[4948]: I1204 17:51:26.335558 4948 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5g7s"] Dec 04 17:51:26 crc kubenswrapper[4948]: I1204 17:51:26.335989 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p5g7s" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="registry-server" containerID="cri-o://0ba493f1995dcea711d156c1bfce2da50b433b7b0ca078483fbbfc2b782e1117" gracePeriod=2 Dec 04 17:51:26 crc kubenswrapper[4948]: I1204 17:51:26.514322 4948 generic.go:334] "Generic (PLEG): container finished" podID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerID="0ba493f1995dcea711d156c1bfce2da50b433b7b0ca078483fbbfc2b782e1117" exitCode=0 Dec 04 17:51:26 crc kubenswrapper[4948]: I1204 17:51:26.514427 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5g7s" event={"ID":"34ed8172-fdcb-4d1f-9d9d-139b67592bdc","Type":"ContainerDied","Data":"0ba493f1995dcea711d156c1bfce2da50b433b7b0ca078483fbbfc2b782e1117"} Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.126503 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.269432 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-utilities\") pod \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.269538 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-catalog-content\") pod \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.269599 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcrtv\" (UniqueName: \"kubernetes.io/projected/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-kube-api-access-tcrtv\") pod \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\" (UID: \"34ed8172-fdcb-4d1f-9d9d-139b67592bdc\") " Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.270336 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-utilities" (OuterVolumeSpecName: "utilities") pod "34ed8172-fdcb-4d1f-9d9d-139b67592bdc" (UID: "34ed8172-fdcb-4d1f-9d9d-139b67592bdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.275394 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-kube-api-access-tcrtv" (OuterVolumeSpecName: "kube-api-access-tcrtv") pod "34ed8172-fdcb-4d1f-9d9d-139b67592bdc" (UID: "34ed8172-fdcb-4d1f-9d9d-139b67592bdc"). InnerVolumeSpecName "kube-api-access-tcrtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.371894 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcrtv\" (UniqueName: \"kubernetes.io/projected/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-kube-api-access-tcrtv\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.371961 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.375854 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34ed8172-fdcb-4d1f-9d9d-139b67592bdc" (UID: "34ed8172-fdcb-4d1f-9d9d-139b67592bdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.473333 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34ed8172-fdcb-4d1f-9d9d-139b67592bdc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.535233 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5g7s" event={"ID":"34ed8172-fdcb-4d1f-9d9d-139b67592bdc","Type":"ContainerDied","Data":"ff5bcf0d550cefbdf0a66d7cc2843b85d93ff68a23b93601cbd53756775b166c"} Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.535275 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p5g7s" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.535291 4948 scope.go:117] "RemoveContainer" containerID="0ba493f1995dcea711d156c1bfce2da50b433b7b0ca078483fbbfc2b782e1117" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.537421 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" event={"ID":"befdfaa6-0042-45e8-aa69-d87a907d3e5b","Type":"ContainerStarted","Data":"e80e94d4772e676987bc37fd7c1634b7e68aa43da3c3c8dfdb597947a145d771"} Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.566523 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-wxwg5" podStartSLOduration=1.6302808629999999 podStartE2EDuration="10.566501385s" podCreationTimestamp="2025-12-04 17:51:19 +0000 UTC" firstStartedPulling="2025-12-04 17:51:20.00225368 +0000 UTC m=+1491.363328082" lastFinishedPulling="2025-12-04 17:51:28.938474192 +0000 UTC m=+1500.299548604" observedRunningTime="2025-12-04 17:51:29.565960888 +0000 UTC m=+1500.927035290" watchObservedRunningTime="2025-12-04 17:51:29.566501385 +0000 UTC m=+1500.927575787" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.574655 4948 scope.go:117] "RemoveContainer" containerID="f51bc04c2961ee7185ab09acbcc408f1ba2580e81248e9464f5bcc5f7d183e5e" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.592030 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5g7s"] Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.604414 4948 scope.go:117] "RemoveContainer" containerID="8e710139c6cd9dc85b5cf07fc0ad78589482e5bc90405bc4a37d6f28e1142f95" Dec 04 17:51:29 crc kubenswrapper[4948]: I1204 17:51:29.610481 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p5g7s"] Dec 04 17:51:29 
crc kubenswrapper[4948]: E1204 17:51:29.694508 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34ed8172_fdcb_4d1f_9d9d_139b67592bdc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34ed8172_fdcb_4d1f_9d9d_139b67592bdc.slice/crio-ff5bcf0d550cefbdf0a66d7cc2843b85d93ff68a23b93601cbd53756775b166c\": RecentStats: unable to find data in memory cache]" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.349441 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-srs7g"] Dec 04 17:51:30 crc kubenswrapper[4948]: E1204 17:51:30.350035 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="extract-content" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.350078 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="extract-content" Dec 04 17:51:30 crc kubenswrapper[4948]: E1204 17:51:30.350104 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="extract-utilities" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.350114 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="extract-utilities" Dec 04 17:51:30 crc kubenswrapper[4948]: E1204 17:51:30.350136 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="registry-server" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.350147 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="registry-server" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.350327 4948 
memory_manager.go:354] "RemoveStaleState removing state" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" containerName="registry-server" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.351621 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.363669 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srs7g"] Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.489800 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-catalog-content\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.490118 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9879t\" (UniqueName: \"kubernetes.io/projected/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-kube-api-access-9879t\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.490277 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-utilities\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.591760 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-utilities\") pod 
\"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.591820 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-catalog-content\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.591881 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9879t\" (UniqueName: \"kubernetes.io/projected/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-kube-api-access-9879t\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.592696 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-utilities\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.592934 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-catalog-content\") pod \"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.613382 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9879t\" (UniqueName: \"kubernetes.io/projected/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-kube-api-access-9879t\") pod 
\"community-operators-srs7g\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.670729 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:30 crc kubenswrapper[4948]: I1204 17:51:30.935609 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ed8172-fdcb-4d1f-9d9d-139b67592bdc" path="/var/lib/kubelet/pods/34ed8172-fdcb-4d1f-9d9d-139b67592bdc/volumes" Dec 04 17:51:31 crc kubenswrapper[4948]: I1204 17:51:31.069510 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srs7g"] Dec 04 17:51:31 crc kubenswrapper[4948]: I1204 17:51:31.548928 4948 generic.go:334] "Generic (PLEG): container finished" podID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerID="a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b" exitCode=0 Dec 04 17:51:31 crc kubenswrapper[4948]: I1204 17:51:31.549022 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srs7g" event={"ID":"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56","Type":"ContainerDied","Data":"a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b"} Dec 04 17:51:31 crc kubenswrapper[4948]: I1204 17:51:31.549282 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srs7g" event={"ID":"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56","Type":"ContainerStarted","Data":"00181597b78aca7b6770a747c0a9e752dc47b2c27bfb67a898bc8e48debce838"} Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.561650 4948 generic.go:334] "Generic (PLEG): container finished" podID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerID="9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c" exitCode=0 Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.561754 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srs7g" event={"ID":"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56","Type":"ContainerDied","Data":"9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c"} Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.785829 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rl477"] Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.786774 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.788917 4948 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jft8q" Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.789476 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.792515 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.811187 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rl477"] Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.933076 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a09c31-f983-44e5-8454-39df52726e91-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rl477\" (UID: \"98a09c31-f983-44e5-8454-39df52726e91\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:33 crc kubenswrapper[4948]: I1204 17:51:33.933444 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgvb\" (UniqueName: 
\"kubernetes.io/projected/98a09c31-f983-44e5-8454-39df52726e91-kube-api-access-csgvb\") pod \"cert-manager-webhook-f4fb5df64-rl477\" (UID: \"98a09c31-f983-44e5-8454-39df52726e91\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.035182 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgvb\" (UniqueName: \"kubernetes.io/projected/98a09c31-f983-44e5-8454-39df52726e91-kube-api-access-csgvb\") pod \"cert-manager-webhook-f4fb5df64-rl477\" (UID: \"98a09c31-f983-44e5-8454-39df52726e91\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.035272 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a09c31-f983-44e5-8454-39df52726e91-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rl477\" (UID: \"98a09c31-f983-44e5-8454-39df52726e91\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.053744 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98a09c31-f983-44e5-8454-39df52726e91-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rl477\" (UID: \"98a09c31-f983-44e5-8454-39df52726e91\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.056737 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgvb\" (UniqueName: \"kubernetes.io/projected/98a09c31-f983-44e5-8454-39df52726e91-kube-api-access-csgvb\") pod \"cert-manager-webhook-f4fb5df64-rl477\" (UID: \"98a09c31-f983-44e5-8454-39df52726e91\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.107318 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.353276 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rl477"] Dec 04 17:51:34 crc kubenswrapper[4948]: W1204 17:51:34.361940 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98a09c31_f983_44e5_8454_39df52726e91.slice/crio-7ad307d3974a66f5e9378bcb749ce960d553c160d1b859437c409b82c36175e8 WatchSource:0}: Error finding container 7ad307d3974a66f5e9378bcb749ce960d553c160d1b859437c409b82c36175e8: Status 404 returned error can't find the container with id 7ad307d3974a66f5e9378bcb749ce960d553c160d1b859437c409b82c36175e8 Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.569076 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srs7g" event={"ID":"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56","Type":"ContainerStarted","Data":"2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9"} Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.570681 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" event={"ID":"98a09c31-f983-44e5-8454-39df52726e91","Type":"ContainerStarted","Data":"7ad307d3974a66f5e9378bcb749ce960d553c160d1b859437c409b82c36175e8"} Dec 04 17:51:34 crc kubenswrapper[4948]: I1204 17:51:34.589689 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-srs7g" podStartSLOduration=2.181548153 podStartE2EDuration="4.589672356s" podCreationTimestamp="2025-12-04 17:51:30 +0000 UTC" firstStartedPulling="2025-12-04 17:51:31.55048354 +0000 UTC m=+1502.911557942" lastFinishedPulling="2025-12-04 17:51:33.958607743 +0000 UTC m=+1505.319682145" observedRunningTime="2025-12-04 17:51:34.588557669 +0000 UTC m=+1505.949632071" 
watchObservedRunningTime="2025-12-04 17:51:34.589672356 +0000 UTC m=+1505.950746758" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.143357 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z"] Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.144614 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.147582 4948 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bqxjq" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.161087 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z"] Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.297925 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1085add4-ec56-476e-8816-81284e3676e4-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-xwz2z\" (UID: \"1085add4-ec56-476e-8816-81284e3676e4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.298105 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2gbk\" (UniqueName: \"kubernetes.io/projected/1085add4-ec56-476e-8816-81284e3676e4-kube-api-access-z2gbk\") pod \"cert-manager-cainjector-855d9ccff4-xwz2z\" (UID: \"1085add4-ec56-476e-8816-81284e3676e4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.399219 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gbk\" (UniqueName: \"kubernetes.io/projected/1085add4-ec56-476e-8816-81284e3676e4-kube-api-access-z2gbk\") pod 
\"cert-manager-cainjector-855d9ccff4-xwz2z\" (UID: \"1085add4-ec56-476e-8816-81284e3676e4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.399545 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1085add4-ec56-476e-8816-81284e3676e4-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-xwz2z\" (UID: \"1085add4-ec56-476e-8816-81284e3676e4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.425079 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2gbk\" (UniqueName: \"kubernetes.io/projected/1085add4-ec56-476e-8816-81284e3676e4-kube-api-access-z2gbk\") pod \"cert-manager-cainjector-855d9ccff4-xwz2z\" (UID: \"1085add4-ec56-476e-8816-81284e3676e4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.425450 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1085add4-ec56-476e-8816-81284e3676e4-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-xwz2z\" (UID: \"1085add4-ec56-476e-8816-81284e3676e4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.470182 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" Dec 04 17:51:38 crc kubenswrapper[4948]: I1204 17:51:38.923079 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z"] Dec 04 17:51:40 crc kubenswrapper[4948]: I1204 17:51:40.625268 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:51:40 crc kubenswrapper[4948]: I1204 17:51:40.625605 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:51:40 crc kubenswrapper[4948]: I1204 17:51:40.672055 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:40 crc kubenswrapper[4948]: I1204 17:51:40.672102 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:40 crc kubenswrapper[4948]: I1204 17:51:40.718298 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:41 crc kubenswrapper[4948]: I1204 17:51:41.669789 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:43 crc kubenswrapper[4948]: I1204 17:51:43.143414 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srs7g"] Dec 04 17:51:43 crc kubenswrapper[4948]: I1204 
17:51:43.666259 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-srs7g" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="registry-server" containerID="cri-o://2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9" gracePeriod=2 Dec 04 17:51:43 crc kubenswrapper[4948]: W1204 17:51:43.766370 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1085add4_ec56_476e_8816_81284e3676e4.slice/crio-3a5e924f6a1f7b7a1b49ba1f9bf9fea0ebd9cf86f7ec6b929c3d805c144ca02c WatchSource:0}: Error finding container 3a5e924f6a1f7b7a1b49ba1f9bf9fea0ebd9cf86f7ec6b929c3d805c144ca02c: Status 404 returned error can't find the container with id 3a5e924f6a1f7b7a1b49ba1f9bf9fea0ebd9cf86f7ec6b929c3d805c144ca02c Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.082001 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.182666 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-utilities\") pod \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.182760 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9879t\" (UniqueName: \"kubernetes.io/projected/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-kube-api-access-9879t\") pod \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.183548 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-utilities" 
(OuterVolumeSpecName: "utilities") pod "e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" (UID: "e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.184404 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-catalog-content\") pod \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\" (UID: \"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56\") " Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.184801 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.188942 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-kube-api-access-9879t" (OuterVolumeSpecName: "kube-api-access-9879t") pod "e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" (UID: "e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56"). InnerVolumeSpecName "kube-api-access-9879t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.246849 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" (UID: "e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.286231 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9879t\" (UniqueName: \"kubernetes.io/projected/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-kube-api-access-9879t\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.286283 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.674325 4948 generic.go:334] "Generic (PLEG): container finished" podID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerID="2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9" exitCode=0 Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.674368 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srs7g" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.675004 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srs7g" event={"ID":"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56","Type":"ContainerDied","Data":"2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9"} Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.675091 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srs7g" event={"ID":"e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56","Type":"ContainerDied","Data":"00181597b78aca7b6770a747c0a9e752dc47b2c27bfb67a898bc8e48debce838"} Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.675123 4948 scope.go:117] "RemoveContainer" containerID="2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.685766 4948 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" event={"ID":"1085add4-ec56-476e-8816-81284e3676e4","Type":"ContainerStarted","Data":"3a5e924f6a1f7b7a1b49ba1f9bf9fea0ebd9cf86f7ec6b929c3d805c144ca02c"} Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.704995 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srs7g"] Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.708679 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-srs7g"] Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.927105 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" path="/var/lib/kubelet/pods/e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56/volumes" Dec 04 17:51:44 crc kubenswrapper[4948]: I1204 17:51:44.995159 4948 scope.go:117] "RemoveContainer" containerID="9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.020324 4948 scope.go:117] "RemoveContainer" containerID="a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.040649 4948 scope.go:117] "RemoveContainer" containerID="2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9" Dec 04 17:51:45 crc kubenswrapper[4948]: E1204 17:51:45.042658 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9\": container with ID starting with 2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9 not found: ID does not exist" containerID="2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.042705 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9"} err="failed to get container status \"2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9\": rpc error: code = NotFound desc = could not find container \"2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9\": container with ID starting with 2ce9651fab0dd1c944adcfe15f8bbc33ed2d237c208f47705def6c9b134936f9 not found: ID does not exist" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.042739 4948 scope.go:117] "RemoveContainer" containerID="9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c" Dec 04 17:51:45 crc kubenswrapper[4948]: E1204 17:51:45.043190 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c\": container with ID starting with 9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c not found: ID does not exist" containerID="9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.043227 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c"} err="failed to get container status \"9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c\": rpc error: code = NotFound desc = could not find container \"9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c\": container with ID starting with 9e94b7fe81d712f48697f02594b61ddf41667de54757aa4f99b4195229f8866c not found: ID does not exist" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.043252 4948 scope.go:117] "RemoveContainer" containerID="a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b" Dec 04 17:51:45 crc kubenswrapper[4948]: E1204 17:51:45.043620 4948 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b\": container with ID starting with a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b not found: ID does not exist" containerID="a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b" Dec 04 17:51:45 crc kubenswrapper[4948]: I1204 17:51:45.043652 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b"} err="failed to get container status \"a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b\": rpc error: code = NotFound desc = could not find container \"a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b\": container with ID starting with a27bfff93272f9f50844c43c9965765f07573ceaeab35aa88fe9ccb06ba76c0b not found: ID does not exist" Dec 04 17:51:46 crc kubenswrapper[4948]: I1204 17:51:46.709648 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" event={"ID":"1085add4-ec56-476e-8816-81284e3676e4","Type":"ContainerStarted","Data":"f5e97850385fa30cbc12003872aafa0f7c50ba0302710828c15d57972f0a84a1"} Dec 04 17:51:46 crc kubenswrapper[4948]: I1204 17:51:46.712501 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" event={"ID":"98a09c31-f983-44e5-8454-39df52726e91","Type":"ContainerStarted","Data":"754f7ec8cab7af2c5589a8c3aa3a6e96193594b5cbde1412c36800d50a49989c"} Dec 04 17:51:46 crc kubenswrapper[4948]: I1204 17:51:46.712662 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:46 crc kubenswrapper[4948]: I1204 17:51:46.725964 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-xwz2z" 
podStartSLOduration=6.091655069 podStartE2EDuration="8.725945567s" podCreationTimestamp="2025-12-04 17:51:38 +0000 UTC" firstStartedPulling="2025-12-04 17:51:43.775389318 +0000 UTC m=+1515.136463720" lastFinishedPulling="2025-12-04 17:51:46.409679816 +0000 UTC m=+1517.770754218" observedRunningTime="2025-12-04 17:51:46.722761632 +0000 UTC m=+1518.083836054" watchObservedRunningTime="2025-12-04 17:51:46.725945567 +0000 UTC m=+1518.087019969" Dec 04 17:51:46 crc kubenswrapper[4948]: I1204 17:51:46.741416 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" podStartSLOduration=4.289272165 podStartE2EDuration="13.741401553s" podCreationTimestamp="2025-12-04 17:51:33 +0000 UTC" firstStartedPulling="2025-12-04 17:51:34.363988255 +0000 UTC m=+1505.725062657" lastFinishedPulling="2025-12-04 17:51:43.816117643 +0000 UTC m=+1515.177192045" observedRunningTime="2025-12-04 17:51:46.738446427 +0000 UTC m=+1518.099520839" watchObservedRunningTime="2025-12-04 17:51:46.741401553 +0000 UTC m=+1518.102475955" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.576216 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-n2t4w"] Dec 04 17:51:51 crc kubenswrapper[4948]: E1204 17:51:51.576943 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="extract-content" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.576969 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="extract-content" Dec 04 17:51:51 crc kubenswrapper[4948]: E1204 17:51:51.576983 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="registry-server" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.576995 4948 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="registry-server" Dec 04 17:51:51 crc kubenswrapper[4948]: E1204 17:51:51.577071 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="extract-utilities" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.577088 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="extract-utilities" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.577260 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04a6aaa-42d6-4980-a2bb-f7be8aaf9e56" containerName="registry-server" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.577880 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.587785 4948 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-9fvnc" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.599319 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-n2t4w"] Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.681912 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e507144d-9fd7-420e-881a-99aee0044dd4-bound-sa-token\") pod \"cert-manager-86cb77c54b-n2t4w\" (UID: \"e507144d-9fd7-420e-881a-99aee0044dd4\") " pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.681975 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6bxc\" (UniqueName: \"kubernetes.io/projected/e507144d-9fd7-420e-881a-99aee0044dd4-kube-api-access-l6bxc\") pod \"cert-manager-86cb77c54b-n2t4w\" (UID: 
\"e507144d-9fd7-420e-881a-99aee0044dd4\") " pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.783528 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e507144d-9fd7-420e-881a-99aee0044dd4-bound-sa-token\") pod \"cert-manager-86cb77c54b-n2t4w\" (UID: \"e507144d-9fd7-420e-881a-99aee0044dd4\") " pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.783577 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bxc\" (UniqueName: \"kubernetes.io/projected/e507144d-9fd7-420e-881a-99aee0044dd4-kube-api-access-l6bxc\") pod \"cert-manager-86cb77c54b-n2t4w\" (UID: \"e507144d-9fd7-420e-881a-99aee0044dd4\") " pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.812365 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e507144d-9fd7-420e-881a-99aee0044dd4-bound-sa-token\") pod \"cert-manager-86cb77c54b-n2t4w\" (UID: \"e507144d-9fd7-420e-881a-99aee0044dd4\") " pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.812694 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6bxc\" (UniqueName: \"kubernetes.io/projected/e507144d-9fd7-420e-881a-99aee0044dd4-kube-api-access-l6bxc\") pod \"cert-manager-86cb77c54b-n2t4w\" (UID: \"e507144d-9fd7-420e-881a-99aee0044dd4\") " pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:51 crc kubenswrapper[4948]: I1204 17:51:51.897118 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-n2t4w" Dec 04 17:51:52 crc kubenswrapper[4948]: I1204 17:51:52.364518 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-n2t4w"] Dec 04 17:51:52 crc kubenswrapper[4948]: I1204 17:51:52.749315 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-n2t4w" event={"ID":"e507144d-9fd7-420e-881a-99aee0044dd4","Type":"ContainerStarted","Data":"3fded387330a49a6cb2d2f542f3b9b9df5f0865131188de4959b0dafc16daa3f"} Dec 04 17:51:52 crc kubenswrapper[4948]: I1204 17:51:52.749372 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-n2t4w" event={"ID":"e507144d-9fd7-420e-881a-99aee0044dd4","Type":"ContainerStarted","Data":"8ce28bb9c56174085a768e31279c52d21c30f7e199f1b13454cbe9e1ab71cec5"} Dec 04 17:51:52 crc kubenswrapper[4948]: I1204 17:51:52.769664 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-n2t4w" podStartSLOduration=1.76964283 podStartE2EDuration="1.76964283s" podCreationTimestamp="2025-12-04 17:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:51:52.76781675 +0000 UTC m=+1524.128891162" watchObservedRunningTime="2025-12-04 17:51:52.76964283 +0000 UTC m=+1524.130717232" Dec 04 17:51:54 crc kubenswrapper[4948]: I1204 17:51:54.110978 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-rl477" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.201363 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ph295"] Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.202536 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.215374 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ph295"] Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.223099 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-kz6tv" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.223130 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.223204 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.372969 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshmd\" (UniqueName: \"kubernetes.io/projected/0cb5e1fc-2fbb-462a-98ee-8873063781ee-kube-api-access-rshmd\") pod \"openstack-operator-index-ph295\" (UID: \"0cb5e1fc-2fbb-462a-98ee-8873063781ee\") " pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.474691 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rshmd\" (UniqueName: \"kubernetes.io/projected/0cb5e1fc-2fbb-462a-98ee-8873063781ee-kube-api-access-rshmd\") pod \"openstack-operator-index-ph295\" (UID: \"0cb5e1fc-2fbb-462a-98ee-8873063781ee\") " pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.504944 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshmd\" (UniqueName: \"kubernetes.io/projected/0cb5e1fc-2fbb-462a-98ee-8873063781ee-kube-api-access-rshmd\") pod \"openstack-operator-index-ph295\" (UID: 
\"0cb5e1fc-2fbb-462a-98ee-8873063781ee\") " pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.528502 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:51:57 crc kubenswrapper[4948]: I1204 17:51:57.867082 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ph295"] Dec 04 17:51:57 crc kubenswrapper[4948]: W1204 17:51:57.875532 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb5e1fc_2fbb_462a_98ee_8873063781ee.slice/crio-0fa1b0c4a803655c795d38a342936652398af869d6907936035b6b666c1c5e9f WatchSource:0}: Error finding container 0fa1b0c4a803655c795d38a342936652398af869d6907936035b6b666c1c5e9f: Status 404 returned error can't find the container with id 0fa1b0c4a803655c795d38a342936652398af869d6907936035b6b666c1c5e9f Dec 04 17:51:58 crc kubenswrapper[4948]: I1204 17:51:58.795879 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ph295" event={"ID":"0cb5e1fc-2fbb-462a-98ee-8873063781ee","Type":"ContainerStarted","Data":"0fa1b0c4a803655c795d38a342936652398af869d6907936035b6b666c1c5e9f"} Dec 04 17:52:00 crc kubenswrapper[4948]: I1204 17:52:00.569837 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ph295"] Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.378678 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-27h4f"] Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.381892 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.390110 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-27h4f"] Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.532847 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhkz\" (UniqueName: \"kubernetes.io/projected/6b7d9d62-89d9-47b1-8757-7c6da0fe06db-kube-api-access-hrhkz\") pod \"openstack-operator-index-27h4f\" (UID: \"6b7d9d62-89d9-47b1-8757-7c6da0fe06db\") " pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.634783 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhkz\" (UniqueName: \"kubernetes.io/projected/6b7d9d62-89d9-47b1-8757-7c6da0fe06db-kube-api-access-hrhkz\") pod \"openstack-operator-index-27h4f\" (UID: \"6b7d9d62-89d9-47b1-8757-7c6da0fe06db\") " pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.657733 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhkz\" (UniqueName: \"kubernetes.io/projected/6b7d9d62-89d9-47b1-8757-7c6da0fe06db-kube-api-access-hrhkz\") pod \"openstack-operator-index-27h4f\" (UID: \"6b7d9d62-89d9-47b1-8757-7c6da0fe06db\") " pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.706376 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.815909 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ph295" event={"ID":"0cb5e1fc-2fbb-462a-98ee-8873063781ee","Type":"ContainerStarted","Data":"c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693"} Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.816069 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ph295" podUID="0cb5e1fc-2fbb-462a-98ee-8873063781ee" containerName="registry-server" containerID="cri-o://c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693" gracePeriod=2 Dec 04 17:52:01 crc kubenswrapper[4948]: I1204 17:52:01.874008 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ph295" podStartSLOduration=1.328831438 podStartE2EDuration="4.873986244s" podCreationTimestamp="2025-12-04 17:51:57 +0000 UTC" firstStartedPulling="2025-12-04 17:51:57.877271731 +0000 UTC m=+1529.238346133" lastFinishedPulling="2025-12-04 17:52:01.422426537 +0000 UTC m=+1532.783500939" observedRunningTime="2025-12-04 17:52:01.866444627 +0000 UTC m=+1533.227519029" watchObservedRunningTime="2025-12-04 17:52:01.873986244 +0000 UTC m=+1533.235060646" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.184226 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-27h4f"] Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.714240 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.822976 4948 generic.go:334] "Generic (PLEG): container finished" podID="0cb5e1fc-2fbb-462a-98ee-8873063781ee" containerID="c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693" exitCode=0 Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.823071 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ph295" event={"ID":"0cb5e1fc-2fbb-462a-98ee-8873063781ee","Type":"ContainerDied","Data":"c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693"} Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.823103 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ph295" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.823132 4948 scope.go:117] "RemoveContainer" containerID="c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.823118 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ph295" event={"ID":"0cb5e1fc-2fbb-462a-98ee-8873063781ee","Type":"ContainerDied","Data":"0fa1b0c4a803655c795d38a342936652398af869d6907936035b6b666c1c5e9f"} Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.824774 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-27h4f" event={"ID":"6b7d9d62-89d9-47b1-8757-7c6da0fe06db","Type":"ContainerStarted","Data":"6648f61117384f42ad0699aa6d920abbb3a6d37d7ef1058dec3449a8c6908b60"} Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.824797 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-27h4f" 
event={"ID":"6b7d9d62-89d9-47b1-8757-7c6da0fe06db","Type":"ContainerStarted","Data":"0cba4443c42318d06f22b3440cb8b6406f7dd702f1c2877e16912b1b4b3a8abd"} Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.841187 4948 scope.go:117] "RemoveContainer" containerID="c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693" Dec 04 17:52:02 crc kubenswrapper[4948]: E1204 17:52:02.841761 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693\": container with ID starting with c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693 not found: ID does not exist" containerID="c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.841972 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693"} err="failed to get container status \"c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693\": rpc error: code = NotFound desc = could not find container \"c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693\": container with ID starting with c23cd0d06206e68aecda98123055d1760e82ac75cf10f410ac5fd13d3f53c693 not found: ID does not exist" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.846533 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-27h4f" podStartSLOduration=1.792972988 podStartE2EDuration="1.846515183s" podCreationTimestamp="2025-12-04 17:52:01 +0000 UTC" firstStartedPulling="2025-12-04 17:52:02.192079465 +0000 UTC m=+1533.553153867" lastFinishedPulling="2025-12-04 17:52:02.24562166 +0000 UTC m=+1533.606696062" observedRunningTime="2025-12-04 17:52:02.843425482 +0000 UTC m=+1534.204499954" watchObservedRunningTime="2025-12-04 17:52:02.846515183 
+0000 UTC m=+1534.207589585" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.864571 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rshmd\" (UniqueName: \"kubernetes.io/projected/0cb5e1fc-2fbb-462a-98ee-8873063781ee-kube-api-access-rshmd\") pod \"0cb5e1fc-2fbb-462a-98ee-8873063781ee\" (UID: \"0cb5e1fc-2fbb-462a-98ee-8873063781ee\") " Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.869160 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb5e1fc-2fbb-462a-98ee-8873063781ee-kube-api-access-rshmd" (OuterVolumeSpecName: "kube-api-access-rshmd") pod "0cb5e1fc-2fbb-462a-98ee-8873063781ee" (UID: "0cb5e1fc-2fbb-462a-98ee-8873063781ee"). InnerVolumeSpecName "kube-api-access-rshmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:52:02 crc kubenswrapper[4948]: I1204 17:52:02.966579 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rshmd\" (UniqueName: \"kubernetes.io/projected/0cb5e1fc-2fbb-462a-98ee-8873063781ee-kube-api-access-rshmd\") on node \"crc\" DevicePath \"\"" Dec 04 17:52:03 crc kubenswrapper[4948]: I1204 17:52:03.147721 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ph295"] Dec 04 17:52:03 crc kubenswrapper[4948]: I1204 17:52:03.154995 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ph295"] Dec 04 17:52:04 crc kubenswrapper[4948]: I1204 17:52:04.928400 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb5e1fc-2fbb-462a-98ee-8873063781ee" path="/var/lib/kubelet/pods/0cb5e1fc-2fbb-462a-98ee-8873063781ee/volumes" Dec 04 17:52:10 crc kubenswrapper[4948]: I1204 17:52:10.625391 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:52:10 crc kubenswrapper[4948]: I1204 17:52:10.625788 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:52:11 crc kubenswrapper[4948]: I1204 17:52:11.707256 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:11 crc kubenswrapper[4948]: I1204 17:52:11.708248 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:11 crc kubenswrapper[4948]: I1204 17:52:11.740071 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:11 crc kubenswrapper[4948]: I1204 17:52:11.993743 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-27h4f" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.665241 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9"] Dec 04 17:52:17 crc kubenswrapper[4948]: E1204 17:52:17.665957 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb5e1fc-2fbb-462a-98ee-8873063781ee" containerName="registry-server" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.665988 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb5e1fc-2fbb-462a-98ee-8873063781ee" containerName="registry-server" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.666521 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0cb5e1fc-2fbb-462a-98ee-8873063781ee" containerName="registry-server" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.668134 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.679014 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9"] Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.679422 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w5ghv" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.801741 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2bmm\" (UniqueName: \"kubernetes.io/projected/2195ca22-748e-4ba3-9524-891a74e7440e-kube-api-access-k2bmm\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.801789 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-bundle\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.801837 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-util\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" 
(UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.903430 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2bmm\" (UniqueName: \"kubernetes.io/projected/2195ca22-748e-4ba3-9524-891a74e7440e-kube-api-access-k2bmm\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.903710 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-bundle\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.903822 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-util\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.904279 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-util\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc 
kubenswrapper[4948]: I1204 17:52:17.904462 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-bundle\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.923367 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2bmm\" (UniqueName: \"kubernetes.io/projected/2195ca22-748e-4ba3-9524-891a74e7440e-kube-api-access-k2bmm\") pod \"69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:17 crc kubenswrapper[4948]: I1204 17:52:17.999327 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:18 crc kubenswrapper[4948]: I1204 17:52:18.406428 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9"] Dec 04 17:52:18 crc kubenswrapper[4948]: W1204 17:52:18.415467 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2195ca22_748e_4ba3_9524_891a74e7440e.slice/crio-edef61bf2364dc77fefa9e4369e6ae2babe273b7d297f34005ab430a16ba5885 WatchSource:0}: Error finding container edef61bf2364dc77fefa9e4369e6ae2babe273b7d297f34005ab430a16ba5885: Status 404 returned error can't find the container with id edef61bf2364dc77fefa9e4369e6ae2babe273b7d297f34005ab430a16ba5885 Dec 04 17:52:19 crc kubenswrapper[4948]: I1204 17:52:19.010124 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" event={"ID":"2195ca22-748e-4ba3-9524-891a74e7440e","Type":"ContainerStarted","Data":"edef61bf2364dc77fefa9e4369e6ae2babe273b7d297f34005ab430a16ba5885"} Dec 04 17:52:21 crc kubenswrapper[4948]: I1204 17:52:21.025198 4948 generic.go:334] "Generic (PLEG): container finished" podID="2195ca22-748e-4ba3-9524-891a74e7440e" containerID="cbde4439b68074b2f348b74ab9bcf25503d5c56a689fd0332538dbb3a24eb2f3" exitCode=0 Dec 04 17:52:21 crc kubenswrapper[4948]: I1204 17:52:21.025303 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" event={"ID":"2195ca22-748e-4ba3-9524-891a74e7440e","Type":"ContainerDied","Data":"cbde4439b68074b2f348b74ab9bcf25503d5c56a689fd0332538dbb3a24eb2f3"} Dec 04 17:52:22 crc kubenswrapper[4948]: I1204 17:52:22.033466 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="2195ca22-748e-4ba3-9524-891a74e7440e" containerID="35d63b5c3e3a8174bc155075cc129ef85316b7c12877cda818ab93a07d3fbc0e" exitCode=0 Dec 04 17:52:22 crc kubenswrapper[4948]: I1204 17:52:22.033539 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" event={"ID":"2195ca22-748e-4ba3-9524-891a74e7440e","Type":"ContainerDied","Data":"35d63b5c3e3a8174bc155075cc129ef85316b7c12877cda818ab93a07d3fbc0e"} Dec 04 17:52:23 crc kubenswrapper[4948]: I1204 17:52:23.042071 4948 generic.go:334] "Generic (PLEG): container finished" podID="2195ca22-748e-4ba3-9524-891a74e7440e" containerID="0d8316996189644a921cc267c4e576e204a5133a60d10593e44ebb0c93260e6d" exitCode=0 Dec 04 17:52:23 crc kubenswrapper[4948]: I1204 17:52:23.042175 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" event={"ID":"2195ca22-748e-4ba3-9524-891a74e7440e","Type":"ContainerDied","Data":"0d8316996189644a921cc267c4e576e204a5133a60d10593e44ebb0c93260e6d"} Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.353637 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.530328 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2bmm\" (UniqueName: \"kubernetes.io/projected/2195ca22-748e-4ba3-9524-891a74e7440e-kube-api-access-k2bmm\") pod \"2195ca22-748e-4ba3-9524-891a74e7440e\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.530668 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-bundle\") pod \"2195ca22-748e-4ba3-9524-891a74e7440e\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.530747 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-util\") pod \"2195ca22-748e-4ba3-9524-891a74e7440e\" (UID: \"2195ca22-748e-4ba3-9524-891a74e7440e\") " Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.532641 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-bundle" (OuterVolumeSpecName: "bundle") pod "2195ca22-748e-4ba3-9524-891a74e7440e" (UID: "2195ca22-748e-4ba3-9524-891a74e7440e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.539953 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2195ca22-748e-4ba3-9524-891a74e7440e-kube-api-access-k2bmm" (OuterVolumeSpecName: "kube-api-access-k2bmm") pod "2195ca22-748e-4ba3-9524-891a74e7440e" (UID: "2195ca22-748e-4ba3-9524-891a74e7440e"). InnerVolumeSpecName "kube-api-access-k2bmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.560784 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-util" (OuterVolumeSpecName: "util") pod "2195ca22-748e-4ba3-9524-891a74e7440e" (UID: "2195ca22-748e-4ba3-9524-891a74e7440e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.634115 4948 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-util\") on node \"crc\" DevicePath \"\"" Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.634185 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2bmm\" (UniqueName: \"kubernetes.io/projected/2195ca22-748e-4ba3-9524-891a74e7440e-kube-api-access-k2bmm\") on node \"crc\" DevicePath \"\"" Dec 04 17:52:24 crc kubenswrapper[4948]: I1204 17:52:24.634201 4948 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2195ca22-748e-4ba3-9524-891a74e7440e-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:52:25 crc kubenswrapper[4948]: I1204 17:52:25.061518 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" event={"ID":"2195ca22-748e-4ba3-9524-891a74e7440e","Type":"ContainerDied","Data":"edef61bf2364dc77fefa9e4369e6ae2babe273b7d297f34005ab430a16ba5885"} Dec 04 17:52:25 crc kubenswrapper[4948]: I1204 17:52:25.061563 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edef61bf2364dc77fefa9e4369e6ae2babe273b7d297f34005ab430a16ba5885" Dec 04 17:52:25 crc kubenswrapper[4948]: I1204 17:52:25.061622 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.432474 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5"] Dec 04 17:52:30 crc kubenswrapper[4948]: E1204 17:52:30.433421 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="pull" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.433441 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="pull" Dec 04 17:52:30 crc kubenswrapper[4948]: E1204 17:52:30.433461 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="extract" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.433473 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="extract" Dec 04 17:52:30 crc kubenswrapper[4948]: E1204 17:52:30.433500 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="util" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.433510 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="util" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.433682 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="2195ca22-748e-4ba3-9524-891a74e7440e" containerName="extract" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.435202 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.437431 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-5zgkd" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.471478 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5"] Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.619080 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwt5\" (UniqueName: \"kubernetes.io/projected/3ec94845-de1c-4ed1-b588-7cbe115fb1d7-kube-api-access-sbwt5\") pod \"openstack-operator-controller-operator-66bcc8f984-mrlv5\" (UID: \"3ec94845-de1c-4ed1-b588-7cbe115fb1d7\") " pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.721322 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwt5\" (UniqueName: \"kubernetes.io/projected/3ec94845-de1c-4ed1-b588-7cbe115fb1d7-kube-api-access-sbwt5\") pod \"openstack-operator-controller-operator-66bcc8f984-mrlv5\" (UID: \"3ec94845-de1c-4ed1-b588-7cbe115fb1d7\") " pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.752851 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbwt5\" (UniqueName: \"kubernetes.io/projected/3ec94845-de1c-4ed1-b588-7cbe115fb1d7-kube-api-access-sbwt5\") pod \"openstack-operator-controller-operator-66bcc8f984-mrlv5\" (UID: \"3ec94845-de1c-4ed1-b588-7cbe115fb1d7\") " pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:30 crc kubenswrapper[4948]: I1204 17:52:30.754289 4948 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:31 crc kubenswrapper[4948]: I1204 17:52:31.211807 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5"] Dec 04 17:52:31 crc kubenswrapper[4948]: W1204 17:52:31.225455 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec94845_de1c_4ed1_b588_7cbe115fb1d7.slice/crio-4ecdbf3954ce55a4227f3ba6f61ea5e941ca98dc7d6b4f0649ac50f82fd68a23 WatchSource:0}: Error finding container 4ecdbf3954ce55a4227f3ba6f61ea5e941ca98dc7d6b4f0649ac50f82fd68a23: Status 404 returned error can't find the container with id 4ecdbf3954ce55a4227f3ba6f61ea5e941ca98dc7d6b4f0649ac50f82fd68a23 Dec 04 17:52:32 crc kubenswrapper[4948]: I1204 17:52:32.110760 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" event={"ID":"3ec94845-de1c-4ed1-b588-7cbe115fb1d7","Type":"ContainerStarted","Data":"4ecdbf3954ce55a4227f3ba6f61ea5e941ca98dc7d6b4f0649ac50f82fd68a23"} Dec 04 17:52:36 crc kubenswrapper[4948]: I1204 17:52:36.134292 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" event={"ID":"3ec94845-de1c-4ed1-b588-7cbe115fb1d7","Type":"ContainerStarted","Data":"35a4d11083bfbcb0fd3272c2f671fbad951ab774b695b4f48db89fbc46ecab22"} Dec 04 17:52:36 crc kubenswrapper[4948]: I1204 17:52:36.135916 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:36 crc kubenswrapper[4948]: I1204 17:52:36.171639 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" podStartSLOduration=1.9106670220000002 podStartE2EDuration="6.17161483s" podCreationTimestamp="2025-12-04 17:52:30 +0000 UTC" firstStartedPulling="2025-12-04 17:52:31.228433242 +0000 UTC m=+1562.589507644" lastFinishedPulling="2025-12-04 17:52:35.48938105 +0000 UTC m=+1566.850455452" observedRunningTime="2025-12-04 17:52:36.165736218 +0000 UTC m=+1567.526810640" watchObservedRunningTime="2025-12-04 17:52:36.17161483 +0000 UTC m=+1567.532689282" Dec 04 17:52:40 crc kubenswrapper[4948]: I1204 17:52:40.624915 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:52:40 crc kubenswrapper[4948]: I1204 17:52:40.625691 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:52:40 crc kubenswrapper[4948]: I1204 17:52:40.625769 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 17:52:40 crc kubenswrapper[4948]: I1204 17:52:40.626853 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:52:40 crc kubenswrapper[4948]: I1204 17:52:40.626969 4948 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" gracePeriod=600 Dec 04 17:52:40 crc kubenswrapper[4948]: I1204 17:52:40.757225 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-66bcc8f984-mrlv5" Dec 04 17:52:40 crc kubenswrapper[4948]: E1204 17:52:40.760981 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5bb3e4_2f5a_47d7_a998_be50d1429cb2.slice/crio-conmon-3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5.scope\": RecentStats: unable to find data in memory cache]" Dec 04 17:52:40 crc kubenswrapper[4948]: E1204 17:52:40.767306 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:52:41 crc kubenswrapper[4948]: I1204 17:52:41.168170 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" exitCode=0 Dec 04 17:52:41 crc kubenswrapper[4948]: I1204 17:52:41.168254 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" 
event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5"} Dec 04 17:52:41 crc kubenswrapper[4948]: I1204 17:52:41.168612 4948 scope.go:117] "RemoveContainer" containerID="e233a346ade9cd965a009c66f42b1cc18967a3cc196c7fca4634b21c2b68b2ec" Dec 04 17:52:41 crc kubenswrapper[4948]: I1204 17:52:41.169484 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:52:41 crc kubenswrapper[4948]: E1204 17:52:41.169934 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:52:52 crc kubenswrapper[4948]: I1204 17:52:52.913970 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:52:52 crc kubenswrapper[4948]: E1204 17:52:52.914757 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:53:04 crc kubenswrapper[4948]: I1204 17:53:04.914176 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:53:04 crc kubenswrapper[4948]: E1204 17:53:04.914850 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:53:16 crc kubenswrapper[4948]: I1204 17:53:16.913968 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:53:16 crc kubenswrapper[4948]: E1204 17:53:16.914571 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.142812 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.144286 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.146471 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-x22h8" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.153375 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.154510 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.157619 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8vbk9" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.167134 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.212264 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.243792 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.245006 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.245949 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.246286 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.249038 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7kr8c" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.256637 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pmv8l" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.274109 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.290271 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.292015 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj72b\" (UniqueName: \"kubernetes.io/projected/4fde973d-7944-478b-a53d-6cbfdbce85e6-kube-api-access-pj72b\") pod \"barbican-operator-controller-manager-7d9dfd778-xnn4n\" (UID: \"4fde973d-7944-478b-a53d-6cbfdbce85e6\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.292102 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srg9s\" (UniqueName: \"kubernetes.io/projected/332ef640-0de7-423e-a7d0-39637d3b4ada-kube-api-access-srg9s\") pod \"cinder-operator-controller-manager-859b6ccc6-4lgxs\" (UID: \"332ef640-0de7-423e-a7d0-39637d3b4ada\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.296847 4948 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.297833 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.302436 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qtq6b" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.322432 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.323428 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.328078 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.331654 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-gbt9z" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.348101 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.352759 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.353790 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.357751 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2p2s6"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.357944 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.381615 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.386466 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.387534 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.393768 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxtd\" (UniqueName: \"kubernetes.io/projected/4bb211de-d340-4f3d-999f-d0759663fc73-kube-api-access-wqxtd\") pod \"glance-operator-controller-manager-77987cd8cd-hvw8z\" (UID: \"4bb211de-d340-4f3d-999f-d0759663fc73\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.393843 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvn5b\" (UniqueName: \"kubernetes.io/projected/3c59027a-d806-4798-8338-a2ea5c9ba1ba-kube-api-access-gvn5b\") pod \"designate-operator-controller-manager-78b4bc895b-gb89j\" (UID: \"3c59027a-d806-4798-8338-a2ea5c9ba1ba\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.393875 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srg9s\" (UniqueName: \"kubernetes.io/projected/332ef640-0de7-423e-a7d0-39637d3b4ada-kube-api-access-srg9s\") pod \"cinder-operator-controller-manager-859b6ccc6-4lgxs\" (UID: \"332ef640-0de7-423e-a7d0-39637d3b4ada\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.393920 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj72b\" (UniqueName: \"kubernetes.io/projected/4fde973d-7944-478b-a53d-6cbfdbce85e6-kube-api-access-pj72b\") pod \"barbican-operator-controller-manager-7d9dfd778-xnn4n\" (UID: \"4fde973d-7944-478b-a53d-6cbfdbce85e6\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.399260 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-4w6gg"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.402395 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.411182 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.412141 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.414679 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mnv2j"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.431188 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj72b\" (UniqueName: \"kubernetes.io/projected/4fde973d-7944-478b-a53d-6cbfdbce85e6-kube-api-access-pj72b\") pod \"barbican-operator-controller-manager-7d9dfd778-xnn4n\" (UID: \"4fde973d-7944-478b-a53d-6cbfdbce85e6\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.431804 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.438765 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.454226 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.455784 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.456663 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-c4r4r"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.461164 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srg9s\" (UniqueName: \"kubernetes.io/projected/332ef640-0de7-423e-a7d0-39637d3b4ada-kube-api-access-srg9s\") pod \"cinder-operator-controller-manager-859b6ccc6-4lgxs\" (UID: \"332ef640-0de7-423e-a7d0-39637d3b4ada\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.462333 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.466167 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.467504 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.471733 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.473123 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fbhps"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.473478 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-t7zwk"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.477173 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496656 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxtd\" (UniqueName: \"kubernetes.io/projected/4bb211de-d340-4f3d-999f-d0759663fc73-kube-api-access-wqxtd\") pod \"glance-operator-controller-manager-77987cd8cd-hvw8z\" (UID: \"4bb211de-d340-4f3d-999f-d0759663fc73\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496724 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg2xm\" (UniqueName: \"kubernetes.io/projected/03cc6bc7-10ac-4521-9688-bbff0633f05a-kube-api-access-sg2xm\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496752 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvn5b\" (UniqueName: \"kubernetes.io/projected/3c59027a-d806-4798-8338-a2ea5c9ba1ba-kube-api-access-gvn5b\") pod \"designate-operator-controller-manager-78b4bc895b-gb89j\" (UID: \"3c59027a-d806-4798-8338-a2ea5c9ba1ba\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496770 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvph6\" (UniqueName: \"kubernetes.io/projected/8000563f-fa21-4755-8434-fc5c4e25cd99-kube-api-access-lvph6\") pod \"horizon-operator-controller-manager-68c6d99b8f-2wn6p\" (UID: \"8000563f-fa21-4755-8434-fc5c4e25cd99\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496794 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5ns5\" (UniqueName: \"kubernetes.io/projected/cb34830b-da24-4d66-b3ca-136506c4ef7b-kube-api-access-s5ns5\") pod \"heat-operator-controller-manager-5f64f6f8bb-ff5cf\" (UID: \"cb34830b-da24-4d66-b3ca-136506c4ef7b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496824 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjf66\" (UniqueName: \"kubernetes.io/projected/0740d32c-babe-4471-9f15-211080e05cbb-kube-api-access-jjf66\") pod \"ironic-operator-controller-manager-6c548fd776-pxvwh\" (UID: \"0740d32c-babe-4471-9f15-211080e05cbb\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.496850 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.504273 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.508256 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.522189 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.525748 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxtd\" (UniqueName: \"kubernetes.io/projected/4bb211de-d340-4f3d-999f-d0759663fc73-kube-api-access-wqxtd\") pod \"glance-operator-controller-manager-77987cd8cd-hvw8z\" (UID: \"4bb211de-d340-4f3d-999f-d0759663fc73\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.530856 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.532347 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.537621 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvn5b\" (UniqueName: \"kubernetes.io/projected/3c59027a-d806-4798-8338-a2ea5c9ba1ba-kube-api-access-gvn5b\") pod \"designate-operator-controller-manager-78b4bc895b-gb89j\" (UID: \"3c59027a-d806-4798-8338-a2ea5c9ba1ba\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.539528 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s7fqb"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.542450 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.553352 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-5g87l"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.563244 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.565812 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-5g87l"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.570737 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-7kvwc"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.577188 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.585738 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.589030 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.591067 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-9pzkl"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.591266 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.592839 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.596120 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597424 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597478 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxx8\" (UniqueName: \"kubernetes.io/projected/98b7f9ee-20c5-4821-9841-c44b60650d4e-kube-api-access-4bxx8\") pod \"keystone-operator-controller-manager-7765d96ddf-p7mxr\" (UID: \"98b7f9ee-20c5-4821-9841-c44b60650d4e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597514 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6h7p\" (UniqueName: \"kubernetes.io/projected/d67b2a10-118c-4a3a-8cc8-a5dc33a92896-kube-api-access-d6h7p\") pod \"manila-operator-controller-manager-7c79b5df47-pvwlh\" (UID: \"d67b2a10-118c-4a3a-8cc8-a5dc33a92896\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597538 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg2xm\" (UniqueName: \"kubernetes.io/projected/03cc6bc7-10ac-4521-9688-bbff0633f05a-kube-api-access-sg2xm\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 
17:53:18.597557 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wggm5\" (UniqueName: \"kubernetes.io/projected/c9f08351-bf2d-4272-a43e-c8770c413a7c-kube-api-access-wggm5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-tx6s4\" (UID: \"c9f08351-bf2d-4272-a43e-c8770c413a7c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597579 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvph6\" (UniqueName: \"kubernetes.io/projected/8000563f-fa21-4755-8434-fc5c4e25cd99-kube-api-access-lvph6\") pod \"horizon-operator-controller-manager-68c6d99b8f-2wn6p\" (UID: \"8000563f-fa21-4755-8434-fc5c4e25cd99\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597598 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47nn\" (UniqueName: \"kubernetes.io/projected/d228e40b-4c01-4794-a80e-7b77ec37ba2b-kube-api-access-h47nn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-89ndh\" (UID: \"d228e40b-4c01-4794-a80e-7b77ec37ba2b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597619 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5ns5\" (UniqueName: \"kubernetes.io/projected/cb34830b-da24-4d66-b3ca-136506c4ef7b-kube-api-access-s5ns5\") pod \"heat-operator-controller-manager-5f64f6f8bb-ff5cf\" (UID: \"cb34830b-da24-4d66-b3ca-136506c4ef7b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.597647 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjf66\" (UniqueName: \"kubernetes.io/projected/0740d32c-babe-4471-9f15-211080e05cbb-kube-api-access-jjf66\") pod \"ironic-operator-controller-manager-6c548fd776-pxvwh\" (UID: \"0740d32c-babe-4471-9f15-211080e05cbb\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"
Dec 04 17:53:18 crc kubenswrapper[4948]: E1204 17:53:18.597969 4948 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 04 17:53:18 crc kubenswrapper[4948]: E1204 17:53:18.598014 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert podName:03cc6bc7-10ac-4521-9688-bbff0633f05a nodeName:}" failed. No retries permitted until 2025-12-04 17:53:19.09800028 +0000 UTC m=+1610.459074672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert") pod "infra-operator-controller-manager-57548d458d-k8zrd" (UID: "03cc6bc7-10ac-4521-9688-bbff0633f05a") : secret "infra-operator-webhook-server-cert" not found
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.600671 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-62tjn"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.601479 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.624539 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg2xm\" (UniqueName: \"kubernetes.io/projected/03cc6bc7-10ac-4521-9688-bbff0633f05a-kube-api-access-sg2xm\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.644905 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvph6\" (UniqueName: \"kubernetes.io/projected/8000563f-fa21-4755-8434-fc5c4e25cd99-kube-api-access-lvph6\") pod \"horizon-operator-controller-manager-68c6d99b8f-2wn6p\" (UID: \"8000563f-fa21-4755-8434-fc5c4e25cd99\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.650753 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.662377 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-2hntl"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.690799 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.692051 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5ns5\" (UniqueName: \"kubernetes.io/projected/cb34830b-da24-4d66-b3ca-136506c4ef7b-kube-api-access-s5ns5\") pod \"heat-operator-controller-manager-5f64f6f8bb-ff5cf\" (UID: \"cb34830b-da24-4d66-b3ca-136506c4ef7b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.696767 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-8rnq5"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.698915 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699110 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxx8\" (UniqueName: \"kubernetes.io/projected/98b7f9ee-20c5-4821-9841-c44b60650d4e-kube-api-access-4bxx8\") pod \"keystone-operator-controller-manager-7765d96ddf-p7mxr\" (UID: \"98b7f9ee-20c5-4821-9841-c44b60650d4e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699209 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgncz\" (UniqueName: \"kubernetes.io/projected/a118aaa1-bd32-4cbb-bc4b-6561faeca58b-kube-api-access-sgncz\") pod 
\"ovn-operator-controller-manager-b6456fdb6-jlxw2\" (UID: \"a118aaa1-bd32-4cbb-bc4b-6561faeca58b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699289 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42p7m\" (UniqueName: \"kubernetes.io/projected/f621ca2b-bd4b-41a2-b11a-985f094886b1-kube-api-access-42p7m\") pod \"nova-operator-controller-manager-697bc559fc-p5t6c\" (UID: \"f621ca2b-bd4b-41a2-b11a-985f094886b1\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699376 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6h7p\" (UniqueName: \"kubernetes.io/projected/d67b2a10-118c-4a3a-8cc8-a5dc33a92896-kube-api-access-d6h7p\") pod \"manila-operator-controller-manager-7c79b5df47-pvwlh\" (UID: \"d67b2a10-118c-4a3a-8cc8-a5dc33a92896\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699455 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jvw\" (UniqueName: \"kubernetes.io/projected/bf2472b9-2441-4c7b-9d50-928f8dc38c78-kube-api-access-w9jvw\") pod \"octavia-operator-controller-manager-998648c74-5g87l\" (UID: \"bf2472b9-2441-4c7b-9d50-928f8dc38c78\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699559 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wggm5\" (UniqueName: \"kubernetes.io/projected/c9f08351-bf2d-4272-a43e-c8770c413a7c-kube-api-access-wggm5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-tx6s4\" (UID: \"c9f08351-bf2d-4272-a43e-c8770c413a7c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699665 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws9jg\" (UniqueName: \"kubernetes.io/projected/0ef37c1f-0fdf-43bd-81cf-4a359b671653-kube-api-access-ws9jg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.699746 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h47nn\" (UniqueName: \"kubernetes.io/projected/d228e40b-4c01-4794-a80e-7b77ec37ba2b-kube-api-access-h47nn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-89ndh\" (UID: \"d228e40b-4c01-4794-a80e-7b77ec37ba2b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.714793 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.720721 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjf66\" (UniqueName: \"kubernetes.io/projected/0740d32c-babe-4471-9f15-211080e05cbb-kube-api-access-jjf66\") pod \"ironic-operator-controller-manager-6c548fd776-pxvwh\" (UID: \"0740d32c-babe-4471-9f15-211080e05cbb\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.723648 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxx8\" (UniqueName: \"kubernetes.io/projected/98b7f9ee-20c5-4821-9841-c44b60650d4e-kube-api-access-4bxx8\") pod 
\"keystone-operator-controller-manager-7765d96ddf-p7mxr\" (UID: \"98b7f9ee-20c5-4821-9841-c44b60650d4e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.725607 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47nn\" (UniqueName: \"kubernetes.io/projected/d228e40b-4c01-4794-a80e-7b77ec37ba2b-kube-api-access-h47nn\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-89ndh\" (UID: \"d228e40b-4c01-4794-a80e-7b77ec37ba2b\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.726216 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wggm5\" (UniqueName: \"kubernetes.io/projected/c9f08351-bf2d-4272-a43e-c8770c413a7c-kube-api-access-wggm5\") pod \"mariadb-operator-controller-manager-56bbcc9d85-tx6s4\" (UID: \"c9f08351-bf2d-4272-a43e-c8770c413a7c\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.734973 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.735956 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.736307 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6h7p\" (UniqueName: \"kubernetes.io/projected/d67b2a10-118c-4a3a-8cc8-a5dc33a92896-kube-api-access-d6h7p\") pod \"manila-operator-controller-manager-7c79b5df47-pvwlh\" (UID: \"d67b2a10-118c-4a3a-8cc8-a5dc33a92896\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.743257 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-d6df7"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.743475 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.753357 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.764424 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-2hntl"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.792022 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.802860 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.802941 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgncz\" (UniqueName: \"kubernetes.io/projected/a118aaa1-bd32-4cbb-bc4b-6561faeca58b-kube-api-access-sgncz\") pod \"ovn-operator-controller-manager-b6456fdb6-jlxw2\" (UID: \"a118aaa1-bd32-4cbb-bc4b-6561faeca58b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.802966 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42p7m\" (UniqueName: \"kubernetes.io/projected/f621ca2b-bd4b-41a2-b11a-985f094886b1-kube-api-access-42p7m\") pod \"nova-operator-controller-manager-697bc559fc-p5t6c\" (UID: \"f621ca2b-bd4b-41a2-b11a-985f094886b1\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.802987 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wpw\" (UniqueName: \"kubernetes.io/projected/cb78bf36-4988-4814-b6a5-cf5c869eaee6-kube-api-access-74wpw\") pod \"placement-operator-controller-manager-78f8948974-2hntl\" (UID: \"cb78bf36-4988-4814-b6a5-cf5c869eaee6\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.803012 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jvw\" (UniqueName: \"kubernetes.io/projected/bf2472b9-2441-4c7b-9d50-928f8dc38c78-kube-api-access-w9jvw\") pod \"octavia-operator-controller-manager-998648c74-5g87l\" (UID: \"bf2472b9-2441-4c7b-9d50-928f8dc38c78\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.803057 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws9jg\" (UniqueName: \"kubernetes.io/projected/0ef37c1f-0fdf-43bd-81cf-4a359b671653-kube-api-access-ws9jg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"
Dec 04 17:53:18 crc kubenswrapper[4948]: E1204 17:53:18.803402 4948 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 04 17:53:18 crc kubenswrapper[4948]: E1204 17:53:18.803439 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert podName:0ef37c1f-0fdf-43bd-81cf-4a359b671653 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:19.303426146 +0000 UTC m=+1610.664500548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" (UID: "0ef37c1f-0fdf-43bd-81cf-4a359b671653") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.826840 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws9jg\" (UniqueName: \"kubernetes.io/projected/0ef37c1f-0fdf-43bd-81cf-4a359b671653-kube-api-access-ws9jg\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.826835 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgncz\" (UniqueName: \"kubernetes.io/projected/a118aaa1-bd32-4cbb-bc4b-6561faeca58b-kube-api-access-sgncz\") pod \"ovn-operator-controller-manager-b6456fdb6-jlxw2\" (UID: \"a118aaa1-bd32-4cbb-bc4b-6561faeca58b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.841106 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj"]
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.842449 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj"
Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.842816 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.859402 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42p7m\" (UniqueName: \"kubernetes.io/projected/f621ca2b-bd4b-41a2-b11a-985f094886b1-kube-api-access-42p7m\") pod \"nova-operator-controller-manager-697bc559fc-p5t6c\" (UID: \"f621ca2b-bd4b-41a2-b11a-985f094886b1\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.860164 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.862562 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.864649 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lt9c2" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.866810 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jvw\" (UniqueName: \"kubernetes.io/projected/bf2472b9-2441-4c7b-9d50-928f8dc38c78-kube-api-access-w9jvw\") pod \"octavia-operator-controller-manager-998648c74-5g87l\" (UID: \"bf2472b9-2441-4c7b-9d50-928f8dc38c78\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.871907 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.887486 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.895367 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.904246 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wpw\" (UniqueName: \"kubernetes.io/projected/cb78bf36-4988-4814-b6a5-cf5c869eaee6-kube-api-access-74wpw\") pod \"placement-operator-controller-manager-78f8948974-2hntl\" (UID: \"cb78bf36-4988-4814-b6a5-cf5c869eaee6\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.904313 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjzz\" (UniqueName: \"kubernetes.io/projected/07646f24-8e16-4202-b2f9-ac13a751235e-kube-api-access-phjzz\") pod \"telemetry-operator-controller-manager-76cc84c6bb-9dfcj\" (UID: \"07646f24-8e16-4202-b2f9-ac13a751235e\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.904345 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8xcl\" (UniqueName: \"kubernetes.io/projected/e1097cb0-78f6-49a6-87d2-4aa88fb31f58-kube-api-access-s8xcl\") pod \"swift-operator-controller-manager-5f8c65bbfc-zqk8k\" (UID: \"e1097cb0-78f6-49a6-87d2-4aa88fb31f58\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.925859 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.930029 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wpw\" (UniqueName: \"kubernetes.io/projected/cb78bf36-4988-4814-b6a5-cf5c869eaee6-kube-api-access-74wpw\") pod \"placement-operator-controller-manager-78f8948974-2hntl\" (UID: \"cb78bf36-4988-4814-b6a5-cf5c869eaee6\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.935153 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.937970 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fvll5"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.940189 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fvll5"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.940275 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.942163 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.942755 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-bhxg5" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.945728 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.961494 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.964712 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2csvw" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.976328 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b"] Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.977185 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.991294 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.991511 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rhk98" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.991611 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 04 17:53:18 crc kubenswrapper[4948]: I1204 17:53:18.991691 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.006619 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm5z4\" (UniqueName: \"kubernetes.io/projected/bd023a53-5fe4-4660-aee2-c8565808da2f-kube-api-access-rm5z4\") pod 
\"watcher-operator-controller-manager-769dc69bc-wrj57\" (UID: \"bd023a53-5fe4-4660-aee2-c8565808da2f\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.006678 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnt7v\" (UniqueName: \"kubernetes.io/projected/f5180167-c92a-4f8a-b924-ee9d6e080261-kube-api-access-tnt7v\") pod \"test-operator-controller-manager-5854674fcc-fvll5\" (UID: \"f5180167-c92a-4f8a-b924-ee9d6e080261\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.006728 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phjzz\" (UniqueName: \"kubernetes.io/projected/07646f24-8e16-4202-b2f9-ac13a751235e-kube-api-access-phjzz\") pod \"telemetry-operator-controller-manager-76cc84c6bb-9dfcj\" (UID: \"07646f24-8e16-4202-b2f9-ac13a751235e\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.006763 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8xcl\" (UniqueName: \"kubernetes.io/projected/e1097cb0-78f6-49a6-87d2-4aa88fb31f58-kube-api-access-s8xcl\") pod \"swift-operator-controller-manager-5f8c65bbfc-zqk8k\" (UID: \"e1097cb0-78f6-49a6-87d2-4aa88fb31f58\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.018462 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.026903 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.042276 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.042371 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.047312 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8dpq7" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.047685 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.048637 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjzz\" (UniqueName: \"kubernetes.io/projected/07646f24-8e16-4202-b2f9-ac13a751235e-kube-api-access-phjzz\") pod \"telemetry-operator-controller-manager-76cc84c6bb-9dfcj\" (UID: \"07646f24-8e16-4202-b2f9-ac13a751235e\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.076115 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8xcl\" (UniqueName: \"kubernetes.io/projected/e1097cb0-78f6-49a6-87d2-4aa88fb31f58-kube-api-access-s8xcl\") pod \"swift-operator-controller-manager-5f8c65bbfc-zqk8k\" (UID: \"e1097cb0-78f6-49a6-87d2-4aa88fb31f58\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.077899 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110085 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110146 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm5z4\" (UniqueName: \"kubernetes.io/projected/bd023a53-5fe4-4660-aee2-c8565808da2f-kube-api-access-rm5z4\") pod \"watcher-operator-controller-manager-769dc69bc-wrj57\" (UID: \"bd023a53-5fe4-4660-aee2-c8565808da2f\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110177 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98v4\" (UniqueName: \"kubernetes.io/projected/80927c44-1bff-48a7-8f3a-25ca44033176-kube-api-access-p98v4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wgqdn\" (UID: \"80927c44-1bff-48a7-8f3a-25ca44033176\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110214 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110232 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgvn\" (UniqueName: \"kubernetes.io/projected/e1c25561-350e-4093-8f84-17a631b22d36-kube-api-access-7pgvn\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110249 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnt7v\" (UniqueName: \"kubernetes.io/projected/f5180167-c92a-4f8a-b924-ee9d6e080261-kube-api-access-tnt7v\") pod \"test-operator-controller-manager-5854674fcc-fvll5\" (UID: \"f5180167-c92a-4f8a-b924-ee9d6e080261\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.110283 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.110416 4948 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.110491 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert podName:03cc6bc7-10ac-4521-9688-bbff0633f05a nodeName:}" failed. No retries permitted until 2025-12-04 17:53:20.110473494 +0000 UTC m=+1611.471547896 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert") pod "infra-operator-controller-manager-57548d458d-k8zrd" (UID: "03cc6bc7-10ac-4521-9688-bbff0633f05a") : secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.121882 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.154097 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm5z4\" (UniqueName: \"kubernetes.io/projected/bd023a53-5fe4-4660-aee2-c8565808da2f-kube-api-access-rm5z4\") pod \"watcher-operator-controller-manager-769dc69bc-wrj57\" (UID: \"bd023a53-5fe4-4660-aee2-c8565808da2f\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.159018 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnt7v\" (UniqueName: \"kubernetes.io/projected/f5180167-c92a-4f8a-b924-ee9d6e080261-kube-api-access-tnt7v\") pod \"test-operator-controller-manager-5854674fcc-fvll5\" (UID: \"f5180167-c92a-4f8a-b924-ee9d6e080261\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.193749 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.233108 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.233822 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p98v4\" (UniqueName: \"kubernetes.io/projected/80927c44-1bff-48a7-8f3a-25ca44033176-kube-api-access-p98v4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wgqdn\" (UID: \"80927c44-1bff-48a7-8f3a-25ca44033176\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.233872 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.233917 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgvn\" (UniqueName: \"kubernetes.io/projected/e1c25561-350e-4093-8f84-17a631b22d36-kube-api-access-7pgvn\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.233988 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: 
\"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.234231 4948 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.234297 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:19.734275404 +0000 UTC m=+1611.095349806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "metrics-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.234773 4948 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.234805 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:19.734797081 +0000 UTC m=+1611.095871483 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.246268 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.282293 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgvn\" (UniqueName: \"kubernetes.io/projected/e1c25561-350e-4093-8f84-17a631b22d36-kube-api-access-7pgvn\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.300570 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.301507 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p98v4\" (UniqueName: \"kubernetes.io/projected/80927c44-1bff-48a7-8f3a-25ca44033176-kube-api-access-p98v4\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wgqdn\" (UID: \"80927c44-1bff-48a7-8f3a-25ca44033176\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.328607 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.335744 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.335910 4948 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.335956 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert podName:0ef37c1f-0fdf-43bd-81cf-4a359b671653 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:20.335940557 +0000 UTC m=+1611.697014959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" (UID: "0ef37c1f-0fdf-43bd-81cf-4a359b671653") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.358182 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.384819 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.392646 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.409179 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.483314 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" event={"ID":"98b7f9ee-20c5-4821-9841-c44b60650d4e","Type":"ContainerStarted","Data":"49b734a4bfff1f30481550a145b4f14d38a0ad4f669735566443636fe3a49ced"} Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.506415 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" event={"ID":"4fde973d-7944-478b-a53d-6cbfdbce85e6","Type":"ContainerStarted","Data":"47eace0981051790fe67888cdbd8d089b08fa472d333e4648e91f54e035e8236"} Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.528288 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" event={"ID":"8000563f-fa21-4755-8434-fc5c4e25cd99","Type":"ContainerStarted","Data":"96bd0fbd68d8efe1c63e9ae4da080bafc59881a3239540f24e57a4bee643f278"} Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.531207 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" event={"ID":"4bb211de-d340-4f3d-999f-d0759663fc73","Type":"ContainerStarted","Data":"94d008bbd2bfd7e93616407066deed688dea9df9266e9183eb367c2df258801f"} Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.532386 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" 
event={"ID":"332ef640-0de7-423e-a7d0-39637d3b4ada","Type":"ContainerStarted","Data":"cd05c82b467de625eb15cd5dba97c483f4e9ebd168fa2633db0aab002dde5c93"} Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.690642 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.705596 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh"] Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.747986 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: I1204 17:53:19.748081 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.748263 4948 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.748317 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:20.748300439 +0000 UTC m=+1612.109374851 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "metrics-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.748671 4948 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: E1204 17:53:19.748707 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:20.748697422 +0000 UTC m=+1612.109771824 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "webhook-server-cert" not found Dec 04 17:53:19 crc kubenswrapper[4948]: W1204 17:53:19.775604 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd67b2a10_118c_4a3a_8cc8_a5dc33a92896.slice/crio-784426a6f0582f7ac19b4f04104fc3d591eca8930efd8fc2559be3d5e8496b57 WatchSource:0}: Error finding container 784426a6f0582f7ac19b4f04104fc3d591eca8930efd8fc2559be3d5e8496b57: Status 404 returned error can't find the container with id 784426a6f0582f7ac19b4f04104fc3d591eca8930efd8fc2559be3d5e8496b57 Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.110206 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k"] Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.116514 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4"] Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.130612 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-5g87l"] Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.163376 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.163541 4948 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.163621 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert podName:03cc6bc7-10ac-4521-9688-bbff0633f05a nodeName:}" failed. No retries permitted until 2025-12-04 17:53:22.163600737 +0000 UTC m=+1613.524675139 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert") pod "infra-operator-controller-manager-57548d458d-k8zrd" (UID: "03cc6bc7-10ac-4521-9688-bbff0633f05a") : secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.169745 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2"] Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.181630 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda118aaa1_bd32_4cbb_bc4b_6561faeca58b.slice/crio-f5beaae65071510ab0f8f4f3c732154efefbdd0303366991796978d25a74fcc9 WatchSource:0}: Error finding container f5beaae65071510ab0f8f4f3c732154efefbdd0303366991796978d25a74fcc9: Status 404 returned error can't find the container with id f5beaae65071510ab0f8f4f3c732154efefbdd0303366991796978d25a74fcc9 Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.270069 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-2hntl"] Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.275459 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c"] Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.285575 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj"] Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.295293 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb78bf36_4988_4814_b6a5_cf5c869eaee6.slice/crio-3f46253f61ff92bdd1a824fe218d21f6d46c3d6d8a1421098b8b5a7175fd2c62 WatchSource:0}: Error finding container 
3f46253f61ff92bdd1a824fe218d21f6d46c3d6d8a1421098b8b5a7175fd2c62: Status 404 returned error can't find the container with id 3f46253f61ff92bdd1a824fe218d21f6d46c3d6d8a1421098b8b5a7175fd2c62 Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.302270 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf621ca2b_bd4b_41a2_b11a_985f094886b1.slice/crio-340ec081c1978f311481328a1bc841885910331d15bd6a6c3f742d503efb3154 WatchSource:0}: Error finding container 340ec081c1978f311481328a1bc841885910331d15bd6a6c3f742d503efb3154: Status 404 returned error can't find the container with id 340ec081c1978f311481328a1bc841885910331d15bd6a6c3f742d503efb3154 Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.302594 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07646f24_8e16_4202_b2f9_ac13a751235e.slice/crio-3c9ceb0685123c5f0c82fb80b6345e9f39ab97e1b0d7255568c5670857fe098b WatchSource:0}: Error finding container 3c9ceb0685123c5f0c82fb80b6345e9f39ab97e1b0d7255568c5670857fe098b: Status 404 returned error can't find the container with id 3c9ceb0685123c5f0c82fb80b6345e9f39ab97e1b0d7255568c5670857fe098b Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.366713 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.366928 4948 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: 
E1204 17:53:20.366981 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert podName:0ef37c1f-0fdf-43bd-81cf-4a359b671653 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:22.366967605 +0000 UTC m=+1613.728042007 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" (UID: "0ef37c1f-0fdf-43bd-81cf-4a359b671653") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.393742 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57"] Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.406309 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd023a53_5fe4_4660_aee2_c8565808da2f.slice/crio-41047dccb1c15ec83ece8d5a40477a23e776de86c6dab01f40a847e0a74c809b WatchSource:0}: Error finding container 41047dccb1c15ec83ece8d5a40477a23e776de86c6dab01f40a847e0a74c809b: Status 404 returned error can't find the container with id 41047dccb1c15ec83ece8d5a40477a23e776de86c6dab01f40a847e0a74c809b Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.409628 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh"] Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.417074 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rm5z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-wrj57_openstack-operators(bd023a53-5fe4-4660-aee2-c8565808da2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.419598 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rm5z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-wrj57_openstack-operators(bd023a53-5fe4-4660-aee2-c8565808da2f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.420773 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" podUID="bd023a53-5fe4-4660-aee2-c8565808da2f" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.425108 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fvll5"] Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.429138 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn"] Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.432155 4948 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5180167_c92a_4f8a_b924_ee9d6e080261.slice/crio-b13765ebcc79711580fc9c2d9c181ddf96e387a2699804f4d94dbc13d9a969ff WatchSource:0}: Error finding container b13765ebcc79711580fc9c2d9c181ddf96e387a2699804f4d94dbc13d9a969ff: Status 404 returned error can't find the container with id b13765ebcc79711580fc9c2d9c181ddf96e387a2699804f4d94dbc13d9a969ff Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.433563 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jjf66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-pxvwh_openstack-operators(0740d32c-babe-4471-9f15-211080e05cbb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.434431 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tnt7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-fvll5_openstack-operators(f5180167-c92a-4f8a-b924-ee9d6e080261): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.436004 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jjf66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-pxvwh_openstack-operators(0740d32c-babe-4471-9f15-211080e05cbb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.436748 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf"] Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.438567 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" 
podUID="0740d32c-babe-4471-9f15-211080e05cbb" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.453611 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tnt7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-fvll5_openstack-operators(f5180167-c92a-4f8a-b924-ee9d6e080261): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: W1204 17:53:20.454079 4948 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb34830b_da24_4d66_b3ca_136506c4ef7b.slice/crio-53476e49a2df2f1a36257e4b7f5c6bd923e414c525a2663b1f0104ea34b19275 WatchSource:0}: Error finding container 53476e49a2df2f1a36257e4b7f5c6bd923e414c525a2663b1f0104ea34b19275: Status 404 returned error can't find the container with id 53476e49a2df2f1a36257e4b7f5c6bd923e414c525a2663b1f0104ea34b19275 Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.454854 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" podUID="f5180167-c92a-4f8a-b924-ee9d6e080261" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.457408 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p98v4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wgqdn_openstack-operators(80927c44-1bff-48a7-8f3a-25ca44033176): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.459226 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" podUID="80927c44-1bff-48a7-8f3a-25ca44033176" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.472080 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5ns5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-ff5cf_openstack-operators(cb34830b-da24-4d66-b3ca-136506c4ef7b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.476333 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5ns5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-ff5cf_openstack-operators(cb34830b-da24-4d66-b3ca-136506c4ef7b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.477650 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" podUID="cb34830b-da24-4d66-b3ca-136506c4ef7b" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.547689 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" event={"ID":"cb34830b-da24-4d66-b3ca-136506c4ef7b","Type":"ContainerStarted","Data":"53476e49a2df2f1a36257e4b7f5c6bd923e414c525a2663b1f0104ea34b19275"} Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.558251 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" 
with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" podUID="cb34830b-da24-4d66-b3ca-136506c4ef7b" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.559394 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" event={"ID":"80927c44-1bff-48a7-8f3a-25ca44033176","Type":"ContainerStarted","Data":"790e0eca3ac540a22f22b3d56634b3fe5afe829088f57962205fef257b57fecb"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.561203 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" event={"ID":"d67b2a10-118c-4a3a-8cc8-a5dc33a92896","Type":"ContainerStarted","Data":"784426a6f0582f7ac19b4f04104fc3d591eca8930efd8fc2559be3d5e8496b57"} Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.561017 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" podUID="80927c44-1bff-48a7-8f3a-25ca44033176" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.566216 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" event={"ID":"bf2472b9-2441-4c7b-9d50-928f8dc38c78","Type":"ContainerStarted","Data":"dff83c870602f3e48b41f9dba28dd6bfc930c72b27989622b2778ff570fb3171"} Dec 04 
17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.568260 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" event={"ID":"0740d32c-babe-4471-9f15-211080e05cbb","Type":"ContainerStarted","Data":"878d98321e9ba6ce9afd8f87eb95e64336089ea56570520ccd102266d07d093d"} Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.569852 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" podUID="0740d32c-babe-4471-9f15-211080e05cbb" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.570817 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" event={"ID":"07646f24-8e16-4202-b2f9-ac13a751235e","Type":"ContainerStarted","Data":"3c9ceb0685123c5f0c82fb80b6345e9f39ab97e1b0d7255568c5670857fe098b"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.575117 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" event={"ID":"cb78bf36-4988-4814-b6a5-cf5c869eaee6","Type":"ContainerStarted","Data":"3f46253f61ff92bdd1a824fe218d21f6d46c3d6d8a1421098b8b5a7175fd2c62"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.582440 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" 
event={"ID":"3c59027a-d806-4798-8338-a2ea5c9ba1ba","Type":"ContainerStarted","Data":"9efabd8e4be01c686e72ac0a162ecb5f106e3d25fdc7349ab85e09bebb967d4f"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.584744 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" event={"ID":"e1097cb0-78f6-49a6-87d2-4aa88fb31f58","Type":"ContainerStarted","Data":"590962f1b67845aa1fb098f77ed30a9c8c4d746ab28753a74a2dfb1de0c13fcd"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.607175 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" event={"ID":"c9f08351-bf2d-4272-a43e-c8770c413a7c","Type":"ContainerStarted","Data":"6b01ec5a7b818a65383949aacc4252628d3a4f2d560134d55060f185b82dd05d"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.611441 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" event={"ID":"d228e40b-4c01-4794-a80e-7b77ec37ba2b","Type":"ContainerStarted","Data":"14b9fcf497cc9ba61601ff03a3470374f6f8a3e56df34a1498f11c00d0a0cc5a"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.612427 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" event={"ID":"f621ca2b-bd4b-41a2-b11a-985f094886b1","Type":"ContainerStarted","Data":"340ec081c1978f311481328a1bc841885910331d15bd6a6c3f742d503efb3154"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.613339 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" event={"ID":"bd023a53-5fe4-4660-aee2-c8565808da2f","Type":"ContainerStarted","Data":"41047dccb1c15ec83ece8d5a40477a23e776de86c6dab01f40a847e0a74c809b"} Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.616760 4948 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" event={"ID":"a118aaa1-bd32-4cbb-bc4b-6561faeca58b","Type":"ContainerStarted","Data":"f5beaae65071510ab0f8f4f3c732154efefbdd0303366991796978d25a74fcc9"} Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.617006 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" podUID="bd023a53-5fe4-4660-aee2-c8565808da2f" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.618865 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" event={"ID":"f5180167-c92a-4f8a-b924-ee9d6e080261","Type":"ContainerStarted","Data":"b13765ebcc79711580fc9c2d9c181ddf96e387a2699804f4d94dbc13d9a969ff"} Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.628286 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" podUID="f5180167-c92a-4f8a-b924-ee9d6e080261" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.774402 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:20 crc kubenswrapper[4948]: I1204 17:53:20.774502 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.774810 4948 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.774904 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:22.774879961 +0000 UTC m=+1614.135954423 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "webhook-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.775318 4948 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 17:53:20 crc kubenswrapper[4948]: E1204 17:53:20.775359 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:22.775347746 +0000 UTC m=+1614.136422218 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "metrics-server-cert" not found Dec 04 17:53:21 crc kubenswrapper[4948]: E1204 17:53:21.631089 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" podUID="80927c44-1bff-48a7-8f3a-25ca44033176" Dec 04 17:53:21 crc kubenswrapper[4948]: E1204 17:53:21.632635 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" podUID="bd023a53-5fe4-4660-aee2-c8565808da2f" Dec 04 17:53:21 crc kubenswrapper[4948]: E1204 17:53:21.633448 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" podUID="0740d32c-babe-4471-9f15-211080e05cbb" Dec 04 17:53:21 crc kubenswrapper[4948]: E1204 17:53:21.633671 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" podUID="f5180167-c92a-4f8a-b924-ee9d6e080261" Dec 04 17:53:21 crc kubenswrapper[4948]: E1204 17:53:21.658467 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" podUID="cb34830b-da24-4d66-b3ca-136506c4ef7b" Dec 04 17:53:22 crc kubenswrapper[4948]: I1204 17:53:22.200465 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.200781 4948 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.200831 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert podName:03cc6bc7-10ac-4521-9688-bbff0633f05a nodeName:}" failed. No retries permitted until 2025-12-04 17:53:26.200814528 +0000 UTC m=+1617.561888920 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert") pod "infra-operator-controller-manager-57548d458d-k8zrd" (UID: "03cc6bc7-10ac-4521-9688-bbff0633f05a") : secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: I1204 17:53:22.411554 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.411707 4948 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.411792 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert podName:0ef37c1f-0fdf-43bd-81cf-4a359b671653 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:26.411772192 +0000 UTC m=+1617.772846594 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" (UID: "0ef37c1f-0fdf-43bd-81cf-4a359b671653") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: I1204 17:53:22.818376 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:22 crc kubenswrapper[4948]: I1204 17:53:22.818525 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.818668 4948 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.818724 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:26.818706133 +0000 UTC m=+1618.179780535 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "webhook-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.819349 4948 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 17:53:22 crc kubenswrapper[4948]: E1204 17:53:22.819382 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:26.819372301 +0000 UTC m=+1618.180446703 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "metrics-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: I1204 17:53:26.271224 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.271399 4948 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.271610 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert 
podName:03cc6bc7-10ac-4521-9688-bbff0633f05a nodeName:}" failed. No retries permitted until 2025-12-04 17:53:34.271592043 +0000 UTC m=+1625.632666445 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert") pod "infra-operator-controller-manager-57548d458d-k8zrd" (UID: "03cc6bc7-10ac-4521-9688-bbff0633f05a") : secret "infra-operator-webhook-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: I1204 17:53:26.487254 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.487646 4948 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.487715 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert podName:0ef37c1f-0fdf-43bd-81cf-4a359b671653 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:34.487697242 +0000 UTC m=+1625.848771664 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" (UID: "0ef37c1f-0fdf-43bd-81cf-4a359b671653") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: I1204 17:53:26.893266 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:26 crc kubenswrapper[4948]: I1204 17:53:26.893363 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.893522 4948 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.893553 4948 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.893609 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:34.893591673 +0000 UTC m=+1626.254666075 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "metrics-server-cert" not found Dec 04 17:53:26 crc kubenswrapper[4948]: E1204 17:53:26.893627 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:34.893619214 +0000 UTC m=+1626.254693616 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "webhook-server-cert" not found Dec 04 17:53:29 crc kubenswrapper[4948]: I1204 17:53:29.913306 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:53:29 crc kubenswrapper[4948]: E1204 17:53:29.913974 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:53:32 crc kubenswrapper[4948]: E1204 17:53:32.937372 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 04 17:53:32 crc kubenswrapper[4948]: 
E1204 17:53:32.937814 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wggm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-tx6s4_openstack-operators(c9f08351-bf2d-4272-a43e-c8770c413a7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.320823 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.332252 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03cc6bc7-10ac-4521-9688-bbff0633f05a-cert\") pod \"infra-operator-controller-manager-57548d458d-k8zrd\" (UID: \"03cc6bc7-10ac-4521-9688-bbff0633f05a\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.524265 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:34 crc kubenswrapper[4948]: E1204 17:53:34.524419 4948 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:34 crc kubenswrapper[4948]: E1204 17:53:34.524493 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert podName:0ef37c1f-0fdf-43bd-81cf-4a359b671653 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:50.524473366 +0000 UTC m=+1641.885547778 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" (UID: "0ef37c1f-0fdf-43bd-81cf-4a359b671653") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.576322 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2p2s6" Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.585058 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.931969 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.932318 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:34 crc kubenswrapper[4948]: E1204 17:53:34.932610 4948 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 04 17:53:34 crc kubenswrapper[4948]: E1204 17:53:34.932677 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs podName:e1c25561-350e-4093-8f84-17a631b22d36 nodeName:}" failed. No retries permitted until 2025-12-04 17:53:50.932651359 +0000 UTC m=+1642.293725761 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs") pod "openstack-operator-controller-manager-f65bcfbd6-ksq9b" (UID: "e1c25561-350e-4093-8f84-17a631b22d36") : secret "webhook-server-cert" not found Dec 04 17:53:34 crc kubenswrapper[4948]: I1204 17:53:34.942165 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-metrics-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:43 crc kubenswrapper[4948]: E1204 17:53:43.484009 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 04 17:53:43 crc kubenswrapper[4948]: E1204 17:53:43.484594 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d6h7p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-pvwlh_openstack-operators(d67b2a10-118c-4a3a-8cc8-a5dc33a92896): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:53:43 crc kubenswrapper[4948]: I1204 17:53:43.914315 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:53:43 crc kubenswrapper[4948]: E1204 17:53:43.914678 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:53:43 crc kubenswrapper[4948]: E1204 17:53:43.952595 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 04 17:53:43 crc kubenswrapper[4948]: E1204 17:53:43.952813 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w9jvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-5g87l_openstack-operators(bf2472b9-2441-4c7b-9d50-928f8dc38c78): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:53:44 crc kubenswrapper[4948]: E1204 17:53:44.409264 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 04 17:53:44 crc kubenswrapper[4948]: E1204 17:53:44.409480 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4bxx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-p7mxr_openstack-operators(98b7f9ee-20c5-4821-9841-c44b60650d4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:53:45 crc kubenswrapper[4948]: E1204 17:53:45.642180 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 04 17:53:45 crc kubenswrapper[4948]: E1204 17:53:45.642612 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-42p7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-p5t6c_openstack-operators(f621ca2b-bd4b-41a2-b11a-985f094886b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:53:47 crc kubenswrapper[4948]: I1204 17:53:47.849815 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd"] Dec 04 17:53:48 crc kubenswrapper[4948]: W1204 17:53:48.014998 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03cc6bc7_10ac_4521_9688_bbff0633f05a.slice/crio-20c00ae937d2997318bbfb2055b217b99d6517d5bd86e2aee5300c1b17edfb91 WatchSource:0}: Error finding container 20c00ae937d2997318bbfb2055b217b99d6517d5bd86e2aee5300c1b17edfb91: Status 404 returned error can't find the container with id 20c00ae937d2997318bbfb2055b217b99d6517d5bd86e2aee5300c1b17edfb91 Dec 04 17:53:48 crc kubenswrapper[4948]: I1204 17:53:48.829122 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" 
event={"ID":"03cc6bc7-10ac-4521-9688-bbff0633f05a","Type":"ContainerStarted","Data":"20c00ae937d2997318bbfb2055b217b99d6517d5bd86e2aee5300c1b17edfb91"} Dec 04 17:53:48 crc kubenswrapper[4948]: I1204 17:53:48.834522 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" event={"ID":"4fde973d-7944-478b-a53d-6cbfdbce85e6","Type":"ContainerStarted","Data":"69afd8a72acc9d9643686505474e1749a26565fb7538b47006a8b9d4f8cb4cd1"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.843436 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" event={"ID":"cb78bf36-4988-4814-b6a5-cf5c869eaee6","Type":"ContainerStarted","Data":"fa9cff5914db8cb6a137d3f0d4b5b45c3ef55479a6ee780893629ed6f7bb0248"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.845024 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" event={"ID":"3c59027a-d806-4798-8338-a2ea5c9ba1ba","Type":"ContainerStarted","Data":"1f4c1d88d5440b67e1621ade6bfd1b5680952e50e79cf6dd8a5d699117d2a315"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.846466 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" event={"ID":"4bb211de-d340-4f3d-999f-d0759663fc73","Type":"ContainerStarted","Data":"5bd0ef9363c2222e2147c8d11bda44a1fe1e6a4ec4afcae01b6d462fa2ddbc95"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.849880 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" event={"ID":"e1097cb0-78f6-49a6-87d2-4aa88fb31f58","Type":"ContainerStarted","Data":"7866f811fd3fbb90a2e53d9d499170ab5c2e8374b045032ccffe02622614ad80"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.853276 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" event={"ID":"f5180167-c92a-4f8a-b924-ee9d6e080261","Type":"ContainerStarted","Data":"5a189327f3928c1eda0fff32ad876f1524b91ad4f28f93b84a75afbcd97bfa47"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.854535 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" event={"ID":"07646f24-8e16-4202-b2f9-ac13a751235e","Type":"ContainerStarted","Data":"d3bb8e93d431daf94295a1d2448ebf0f84b41fa09ed61f7b36ea07a3bd290495"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.855860 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" event={"ID":"d228e40b-4c01-4794-a80e-7b77ec37ba2b","Type":"ContainerStarted","Data":"fa6d4cb24ccbfe271f6771af8bf57212fd69db0103f54ff97687613b3de9926f"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.857153 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" event={"ID":"8000563f-fa21-4755-8434-fc5c4e25cd99","Type":"ContainerStarted","Data":"36d0360a3f44450f80b3a702928bd47271b1910d88818115b752fa7c41798c89"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.859731 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" event={"ID":"332ef640-0de7-423e-a7d0-39637d3b4ada","Type":"ContainerStarted","Data":"9fb1ab88ef37888063657ceb42e75ebcd16bc405b6b30369697ec00c1460cfc2"} Dec 04 17:53:49 crc kubenswrapper[4948]: I1204 17:53:49.861477 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" event={"ID":"a118aaa1-bd32-4cbb-bc4b-6561faeca58b","Type":"ContainerStarted","Data":"775b2edc4453cba6620db26d6c087addd064e8d512e0ee9d9e1ad2442945f822"} Dec 04 
17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.560141 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.566754 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ef37c1f-0fdf-43bd-81cf-4a359b671653-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp\" (UID: \"0ef37c1f-0fdf-43bd-81cf-4a359b671653\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.797571 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-62tjn" Dec 04 17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.802519 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.877419 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" event={"ID":"0740d32c-babe-4471-9f15-211080e05cbb","Type":"ContainerStarted","Data":"df55b9d8aab91590124868ab0cc8eff1c19b848e86aecd07cc7d0f515579acdf"} Dec 04 17:53:50 crc kubenswrapper[4948]: E1204 17:53:50.928418 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" podUID="c9f08351-bf2d-4272-a43e-c8770c413a7c" Dec 04 17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.965911 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:50 crc kubenswrapper[4948]: I1204 17:53:50.972851 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e1c25561-350e-4093-8f84-17a631b22d36-webhook-certs\") pod \"openstack-operator-controller-manager-f65bcfbd6-ksq9b\" (UID: \"e1c25561-350e-4093-8f84-17a631b22d36\") " pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.177512 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rhk98" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.182115 
4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.556439 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp"] Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.817372 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b"] Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.936725 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" event={"ID":"4bb211de-d340-4f3d-999f-d0759663fc73","Type":"ContainerStarted","Data":"1fe2134a83fd8e77eab159df1cbc137202ac6110edbdad1b9620b7385b806a4d"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.936916 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.949463 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" event={"ID":"e1097cb0-78f6-49a6-87d2-4aa88fb31f58","Type":"ContainerStarted","Data":"967373e0cd4469eef1b1ccbe018b17efddd0b4906c9f9d3614a0353c01289082"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.949632 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" Dec 04 17:53:51 crc kubenswrapper[4948]: E1204 17:53:51.958116 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" podUID="d67b2a10-118c-4a3a-8cc8-a5dc33a92896" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.963546 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" event={"ID":"0ef37c1f-0fdf-43bd-81cf-4a359b671653","Type":"ContainerStarted","Data":"8c016e3c70903235768c752516f9d6349e2ade10c7f4f808ba51e02088e126cc"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.966488 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" event={"ID":"d228e40b-4c01-4794-a80e-7b77ec37ba2b","Type":"ContainerStarted","Data":"832e3643154d0dfddd13dd7f52280692bfce47acb336778cd0677ab1666a782c"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.967959 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.968984 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" podStartSLOduration=2.491372301 podStartE2EDuration="33.968969564s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.450718001 +0000 UTC m=+1610.811792403" lastFinishedPulling="2025-12-04 17:53:50.928315254 +0000 UTC m=+1642.289389666" observedRunningTime="2025-12-04 17:53:51.963666375 +0000 UTC m=+1643.324740787" watchObservedRunningTime="2025-12-04 17:53:51.968969564 +0000 UTC m=+1643.330043966" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.969703 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" 
event={"ID":"bd023a53-5fe4-4660-aee2-c8565808da2f","Type":"ContainerStarted","Data":"0eb04b9bd6d81a22c36f43b2dfb3def52b24d59ffc9b0c8ff8e40b22a58132ab"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.969788 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" event={"ID":"bd023a53-5fe4-4660-aee2-c8565808da2f","Type":"ContainerStarted","Data":"e9dd083bb39836e14b16a35a7f55b4b70d4c78c7fab9d6bf04d07955cf36391a"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.970707 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.974247 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" event={"ID":"c9f08351-bf2d-4272-a43e-c8770c413a7c","Type":"ContainerStarted","Data":"c90d1e9dfde5e9f7fc6acf60cc124cb154e1214eb5cd8c9008aee2b3e20d1341"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.977738 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" event={"ID":"e1c25561-350e-4093-8f84-17a631b22d36","Type":"ContainerStarted","Data":"3ebe2eaf65357a1e7e34683b5161928a5bfd0af532ddd389a2513d881f4bfe91"} Dec 04 17:53:51 crc kubenswrapper[4948]: I1204 17:53:51.984832 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" podStartSLOduration=3.386517855 podStartE2EDuration="33.984815637s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.134336477 +0000 UTC m=+1611.495410879" lastFinishedPulling="2025-12-04 17:53:50.732634259 +0000 UTC m=+1642.093708661" observedRunningTime="2025-12-04 17:53:51.980062176 +0000 UTC m=+1643.341136588" 
watchObservedRunningTime="2025-12-04 17:53:51.984815637 +0000 UTC m=+1643.345890039" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.006560 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" event={"ID":"a118aaa1-bd32-4cbb-bc4b-6561faeca58b","Type":"ContainerStarted","Data":"d15877a3b719ea386046476c0ad7bee4ff74b9ff67ac15918a6057a6f4046401"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.007620 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.010256 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" podStartSLOduration=2.79996154 podStartE2EDuration="34.010240104s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.711943877 +0000 UTC m=+1611.073018279" lastFinishedPulling="2025-12-04 17:53:50.922222441 +0000 UTC m=+1642.283296843" observedRunningTime="2025-12-04 17:53:52.004912625 +0000 UTC m=+1643.365987037" watchObservedRunningTime="2025-12-04 17:53:52.010240104 +0000 UTC m=+1643.371314506" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.011132 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" event={"ID":"cb34830b-da24-4d66-b3ca-136506c4ef7b","Type":"ContainerStarted","Data":"9520fc1ae7ef68cc11c2b3e6545c2c125e768619a8693deb6aa6356c1433dfee"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.011180 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" event={"ID":"cb34830b-da24-4d66-b3ca-136506c4ef7b","Type":"ContainerStarted","Data":"991b50d6bdf3ed34d04756dcd8a272317b2e74ff22b95505504b90b145c56913"} Dec 04 
17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.011834 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.030474 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" event={"ID":"0740d32c-babe-4471-9f15-211080e05cbb","Type":"ContainerStarted","Data":"68ec5828bba289dc4a527a344141bcee8150ee2251931c3296c72ab94dd3e781"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.031337 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.033811 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" podStartSLOduration=6.529944495 podStartE2EDuration="34.033792392s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.416893222 +0000 UTC m=+1611.777967624" lastFinishedPulling="2025-12-04 17:53:47.920741119 +0000 UTC m=+1639.281815521" observedRunningTime="2025-12-04 17:53:52.031707646 +0000 UTC m=+1643.392782058" watchObservedRunningTime="2025-12-04 17:53:52.033792392 +0000 UTC m=+1643.394866794" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.037491 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" event={"ID":"07646f24-8e16-4202-b2f9-ac13a751235e","Type":"ContainerStarted","Data":"ab1d433355bd53cb1f2069fe415bd354887628b4b57f23b97e829c2d55a8931b"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.038381 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" Dec 04 17:53:52 
crc kubenswrapper[4948]: E1204 17:53:52.038552 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" podUID="98b7f9ee-20c5-4821-9841-c44b60650d4e" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.040359 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" event={"ID":"8000563f-fa21-4755-8434-fc5c4e25cd99","Type":"ContainerStarted","Data":"014bf8d1369d5d134097ada750b347d647c54edf75cfe75d376a1e9a49e58edd"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.040925 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.054893 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" event={"ID":"332ef640-0de7-423e-a7d0-39637d3b4ada","Type":"ContainerStarted","Data":"fa6602863cb763aaae5f5944793b05b0067cb1162902861aa95d7ad5baf9d5a2"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.055075 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.058664 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" event={"ID":"80927c44-1bff-48a7-8f3a-25ca44033176","Type":"ContainerStarted","Data":"ca3b7b0d29af5330a3f3d4435e9758050d10235b50fd034937fdd35193eab2ea"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.063404 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" event={"ID":"cb78bf36-4988-4814-b6a5-cf5c869eaee6","Type":"ContainerStarted","Data":"a82e754e435659830ee2506069b60fde07104d6f8d9de25b06248b31229d2f3d"} Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.064244 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" Dec 04 17:53:52 crc kubenswrapper[4948]: E1204 17:53:52.066586 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" podUID="f621ca2b-bd4b-41a2-b11a-985f094886b1" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.076398 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" podStartSLOduration=2.572070267 podStartE2EDuration="34.076374985s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.424202732 +0000 UTC m=+1610.785277134" lastFinishedPulling="2025-12-04 17:53:50.92850746 +0000 UTC m=+1642.289581852" observedRunningTime="2025-12-04 17:53:52.070086345 +0000 UTC m=+1643.431160747" watchObservedRunningTime="2025-12-04 17:53:52.076374985 +0000 UTC m=+1643.437449387" Dec 04 17:53:52 crc kubenswrapper[4948]: E1204 17:53:52.087109 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" podUID="bf2472b9-2441-4c7b-9d50-928f8dc38c78" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.101574 4948 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" podStartSLOduration=6.533133864 podStartE2EDuration="34.101552874s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.471915577 +0000 UTC m=+1611.832989979" lastFinishedPulling="2025-12-04 17:53:48.040334577 +0000 UTC m=+1639.401408989" observedRunningTime="2025-12-04 17:53:52.095072658 +0000 UTC m=+1643.456147070" watchObservedRunningTime="2025-12-04 17:53:52.101552874 +0000 UTC m=+1643.462627276" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.120607 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" podStartSLOduration=3.49887044 podStartE2EDuration="34.120594589s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.306498652 +0000 UTC m=+1611.667573054" lastFinishedPulling="2025-12-04 17:53:50.928222801 +0000 UTC m=+1642.289297203" observedRunningTime="2025-12-04 17:53:52.116713316 +0000 UTC m=+1643.477787728" watchObservedRunningTime="2025-12-04 17:53:52.120594589 +0000 UTC m=+1643.481668991" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.151929 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" podStartSLOduration=2.756258566 podStartE2EDuration="34.151909443s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.415210067 +0000 UTC m=+1610.776284469" lastFinishedPulling="2025-12-04 17:53:50.810860944 +0000 UTC m=+1642.171935346" observedRunningTime="2025-12-04 17:53:52.147540805 +0000 UTC m=+1643.508615217" watchObservedRunningTime="2025-12-04 17:53:52.151909443 +0000 UTC m=+1643.512983845" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.166112 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" podStartSLOduration=6.559332346 podStartE2EDuration="34.166092964s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.433420424 +0000 UTC m=+1611.794494826" lastFinishedPulling="2025-12-04 17:53:48.040181032 +0000 UTC m=+1639.401255444" observedRunningTime="2025-12-04 17:53:52.164424921 +0000 UTC m=+1643.525499323" watchObservedRunningTime="2025-12-04 17:53:52.166092964 +0000 UTC m=+1643.527167366" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.186753 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" podStartSLOduration=3.579174955 podStartE2EDuration="34.186734389s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.184696509 +0000 UTC m=+1611.545770911" lastFinishedPulling="2025-12-04 17:53:50.792255943 +0000 UTC m=+1642.153330345" observedRunningTime="2025-12-04 17:53:52.181038079 +0000 UTC m=+1643.542112481" watchObservedRunningTime="2025-12-04 17:53:52.186734389 +0000 UTC m=+1643.547808791" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.207012 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" podStartSLOduration=3.565793077 podStartE2EDuration="34.206996193s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.297141616 +0000 UTC m=+1611.658216018" lastFinishedPulling="2025-12-04 17:53:50.938344732 +0000 UTC m=+1642.299419134" observedRunningTime="2025-12-04 17:53:52.202409937 +0000 UTC m=+1643.563484349" watchObservedRunningTime="2025-12-04 17:53:52.206996193 +0000 UTC m=+1643.568070595" Dec 04 17:53:52 crc kubenswrapper[4948]: I1204 17:53:52.230909 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wgqdn" podStartSLOduration=6.23559973 podStartE2EDuration="34.230885912s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.457261586 +0000 UTC m=+1611.818335978" lastFinishedPulling="2025-12-04 17:53:48.452547758 +0000 UTC m=+1639.813622160" observedRunningTime="2025-12-04 17:53:52.229369603 +0000 UTC m=+1643.590444005" watchObservedRunningTime="2025-12-04 17:53:52.230885912 +0000 UTC m=+1643.591960314" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.072134 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" event={"ID":"4fde973d-7944-478b-a53d-6cbfdbce85e6","Type":"ContainerStarted","Data":"eba051a17a75873b9cf5893955a26ac8326d245a58c5dc65ac1b7b585e9d644c"} Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.072503 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.074951 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.076032 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" event={"ID":"f621ca2b-bd4b-41a2-b11a-985f094886b1","Type":"ContainerStarted","Data":"ce56b167b812a5e8744d9abee869dfc82c2fb0cc77b92d3f89901b75db4f6413"} Dec 04 17:53:53 crc kubenswrapper[4948]: E1204 17:53:53.077574 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" podUID="f621ca2b-bd4b-41a2-b11a-985f094886b1" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.078325 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" event={"ID":"bf2472b9-2441-4c7b-9d50-928f8dc38c78","Type":"ContainerStarted","Data":"9b20d6fbd9bcb7114321cdec590fedf4f4570592f7c6f5f4077a59ada0525241"} Dec 04 17:53:53 crc kubenswrapper[4948]: E1204 17:53:53.080680 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" podUID="bf2472b9-2441-4c7b-9d50-928f8dc38c78" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.101504 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-xnn4n" podStartSLOduration=2.923672716 podStartE2EDuration="35.10147797s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.312211159 +0000 UTC m=+1610.673285561" lastFinishedPulling="2025-12-04 17:53:51.490016413 +0000 UTC m=+1642.851090815" observedRunningTime="2025-12-04 17:53:53.095333125 +0000 UTC m=+1644.456407527" watchObservedRunningTime="2025-12-04 17:53:53.10147797 +0000 UTC m=+1644.462552402" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.102495 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" event={"ID":"3c59027a-d806-4798-8338-a2ea5c9ba1ba","Type":"ContainerStarted","Data":"7b055a24e06a2964a79b9a730752ee9abd83b62ff7dab87cf175c5bf03f6caab"} Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 
17:53:53.102770 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.110009 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" event={"ID":"e1c25561-350e-4093-8f84-17a631b22d36","Type":"ContainerStarted","Data":"95e8b3ea67b48385d90d379aab2ec8f562facf62170fe7e4471700f8653430ff"} Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.110705 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.114407 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" event={"ID":"98b7f9ee-20c5-4821-9841-c44b60650d4e","Type":"ContainerStarted","Data":"2658c3c3d6d9f59992226a347d26250efc4c1cb79aab20b33cdf8a7409335b90"} Dec 04 17:53:53 crc kubenswrapper[4948]: E1204 17:53:53.116750 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" podUID="98b7f9ee-20c5-4821-9841-c44b60650d4e" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.127689 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" event={"ID":"f5180167-c92a-4f8a-b924-ee9d6e080261","Type":"ContainerStarted","Data":"407fb2ef3d2d81aba504587845e03a5c9459fdd3198373cdab5e326afb6560ff"} Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.128557 4948 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.129668 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" event={"ID":"c9f08351-bf2d-4272-a43e-c8770c413a7c","Type":"ContainerStarted","Data":"8a27a3647cb68f481939a9c01fe2cde7f50e87d2577dfa263ee9eaff0ed038d1"} Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.130064 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.133157 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" event={"ID":"d67b2a10-118c-4a3a-8cc8-a5dc33a92896","Type":"ContainerStarted","Data":"31421a8cc68a2fab347553e1439df9cfcf722741d8a42028ece34611d3e5b84b"} Dec 04 17:53:53 crc kubenswrapper[4948]: E1204 17:53:53.135748 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" podUID="d67b2a10-118c-4a3a-8cc8-a5dc33a92896" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.135940 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-89ndh" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.139179 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-hvw8z" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.139407 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-jlxw2" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.270752 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" podStartSLOduration=35.270726825 podStartE2EDuration="35.270726825s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:53:53.261784862 +0000 UTC m=+1644.622859264" watchObservedRunningTime="2025-12-04 17:53:53.270726825 +0000 UTC m=+1644.631801227" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.305990 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" podStartSLOduration=2.813722063 podStartE2EDuration="35.305968725s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.143028772 +0000 UTC m=+1611.504103174" lastFinishedPulling="2025-12-04 17:53:52.635275434 +0000 UTC m=+1643.996349836" observedRunningTime="2025-12-04 17:53:53.285171004 +0000 UTC m=+1644.646245416" watchObservedRunningTime="2025-12-04 17:53:53.305968725 +0000 UTC m=+1644.667043117" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.341822 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" podStartSLOduration=4.287925542 podStartE2EDuration="35.341798963s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.434283803 +0000 UTC m=+1611.795358205" lastFinishedPulling="2025-12-04 17:53:51.488157224 +0000 UTC m=+1642.849231626" observedRunningTime="2025-12-04 17:53:53.333402966 +0000 UTC m=+1644.694477368" 
watchObservedRunningTime="2025-12-04 17:53:53.341798963 +0000 UTC m=+1644.702873365" Dec 04 17:53:53 crc kubenswrapper[4948]: I1204 17:53:53.350625 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" podStartSLOduration=3.329000126 podStartE2EDuration="35.350606061s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.49428599 +0000 UTC m=+1610.855360392" lastFinishedPulling="2025-12-04 17:53:51.515891925 +0000 UTC m=+1642.876966327" observedRunningTime="2025-12-04 17:53:53.348731092 +0000 UTC m=+1644.709805494" watchObservedRunningTime="2025-12-04 17:53:53.350606061 +0000 UTC m=+1644.711680463" Dec 04 17:53:54 crc kubenswrapper[4948]: I1204 17:53:54.141934 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fvll5" Dec 04 17:53:54 crc kubenswrapper[4948]: E1204 17:53:54.143241 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" podUID="98b7f9ee-20c5-4821-9841-c44b60650d4e" Dec 04 17:53:54 crc kubenswrapper[4948]: E1204 17:53:54.143243 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" podUID="f621ca2b-bd4b-41a2-b11a-985f094886b1" Dec 04 17:53:54 crc kubenswrapper[4948]: I1204 17:53:54.145414 4948 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-gb89j" Dec 04 17:53:54 crc kubenswrapper[4948]: I1204 17:53:54.146372 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-9dfcj" Dec 04 17:53:54 crc kubenswrapper[4948]: I1204 17:53:54.146482 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-2hntl" Dec 04 17:53:54 crc kubenswrapper[4948]: I1204 17:53:54.920001 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:53:54 crc kubenswrapper[4948]: E1204 17:53:54.920725 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.149174 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" event={"ID":"03cc6bc7-10ac-4521-9688-bbff0633f05a","Type":"ContainerStarted","Data":"d132d83ae06c2ef8954a20618230e5a05d724e81664fae89ca6a02c20cdd2e68"} Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.149219 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" event={"ID":"03cc6bc7-10ac-4521-9688-bbff0633f05a","Type":"ContainerStarted","Data":"6812596ae225ff1f4b2da743b7af847de8be0c7d90aad26d073a5dfa913819cc"} Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.149346 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.151372 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" event={"ID":"0ef37c1f-0fdf-43bd-81cf-4a359b671653","Type":"ContainerStarted","Data":"e6cd029f7563f2589fd458e79eb8b85d31e8fd02fd4a8244d74481b30a5a0a23"} Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.151426 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" event={"ID":"0ef37c1f-0fdf-43bd-81cf-4a359b671653","Type":"ContainerStarted","Data":"ec739813aab13ac04f66d614a010966ac3f3a96a728cca60d2847c6c6672a408"} Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.151531 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.153066 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" event={"ID":"d67b2a10-118c-4a3a-8cc8-a5dc33a92896","Type":"ContainerStarted","Data":"99a8818c3c3162275e391f06a2dfdc998eae515a2334d89ddd70d32dc460fe37"} Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.153251 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.154881 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" event={"ID":"bf2472b9-2441-4c7b-9d50-928f8dc38c78","Type":"ContainerStarted","Data":"4af55070d48434554c8fca8f199ac7081a1a58af434093a6aa6c2930ae43e681"} Dec 04 
17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.197466 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" podStartSLOduration=2.511362519 podStartE2EDuration="37.197447134s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.152272775 +0000 UTC m=+1611.513347177" lastFinishedPulling="2025-12-04 17:53:54.83835739 +0000 UTC m=+1646.199431792" observedRunningTime="2025-12-04 17:53:55.193985014 +0000 UTC m=+1646.555059426" watchObservedRunningTime="2025-12-04 17:53:55.197447134 +0000 UTC m=+1646.558521536" Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.198253 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" podStartSLOduration=30.940872477 podStartE2EDuration="37.19824915s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:48.04453038 +0000 UTC m=+1639.405604802" lastFinishedPulling="2025-12-04 17:53:54.301907073 +0000 UTC m=+1645.662981475" observedRunningTime="2025-12-04 17:53:55.176600772 +0000 UTC m=+1646.537675194" watchObservedRunningTime="2025-12-04 17:53:55.19824915 +0000 UTC m=+1646.559323552" Dec 04 17:53:55 crc kubenswrapper[4948]: I1204 17:53:55.232385 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" podStartSLOduration=34.518225257 podStartE2EDuration="37.232359253s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:51.59038362 +0000 UTC m=+1642.951458022" lastFinishedPulling="2025-12-04 17:53:54.304517616 +0000 UTC m=+1645.665592018" observedRunningTime="2025-12-04 17:53:55.225524036 +0000 UTC m=+1646.586598448" watchObservedRunningTime="2025-12-04 17:53:55.232359253 +0000 UTC m=+1646.593433675" Dec 04 17:53:55 
crc kubenswrapper[4948]: I1204 17:53:55.250480 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" podStartSLOduration=2.1930273160000002 podStartE2EDuration="37.250456498s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:19.780376721 +0000 UTC m=+1611.141451123" lastFinishedPulling="2025-12-04 17:53:54.837805903 +0000 UTC m=+1646.198880305" observedRunningTime="2025-12-04 17:53:55.245001804 +0000 UTC m=+1646.606076216" watchObservedRunningTime="2025-12-04 17:53:55.250456498 +0000 UTC m=+1646.611530910" Dec 04 17:53:58 crc kubenswrapper[4948]: I1204 17:53:58.508720 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-4lgxs" Dec 04 17:53:58 crc kubenswrapper[4948]: I1204 17:53:58.657997 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-2wn6p" Dec 04 17:53:58 crc kubenswrapper[4948]: I1204 17:53:58.875782 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-tx6s4" Dec 04 17:53:58 crc kubenswrapper[4948]: I1204 17:53:58.895517 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" Dec 04 17:53:58 crc kubenswrapper[4948]: I1204 17:53:58.938148 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-ff5cf" Dec 04 17:53:59 crc kubenswrapper[4948]: I1204 17:53:59.021053 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-pxvwh" Dec 04 17:53:59 crc kubenswrapper[4948]: I1204 17:53:59.082580 
4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-zqk8k" Dec 04 17:53:59 crc kubenswrapper[4948]: I1204 17:53:59.331387 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-wrj57" Dec 04 17:54:00 crc kubenswrapper[4948]: I1204 17:54:00.815476 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp" Dec 04 17:54:01 crc kubenswrapper[4948]: I1204 17:54:01.190822 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-f65bcfbd6-ksq9b" Dec 04 17:54:04 crc kubenswrapper[4948]: I1204 17:54:04.602284 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-k8zrd" Dec 04 17:54:05 crc kubenswrapper[4948]: I1204 17:54:05.915744 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:54:05 crc kubenswrapper[4948]: E1204 17:54:05.916210 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:54:05 crc kubenswrapper[4948]: I1204 17:54:05.919304 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.268113 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" event={"ID":"f621ca2b-bd4b-41a2-b11a-985f094886b1","Type":"ContainerStarted","Data":"e737c3017a850e5d98d6a75648ef4971035bbf9c7acd5a3343d681d8dad3ff7e"} Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.268901 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.271722 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" event={"ID":"98b7f9ee-20c5-4821-9841-c44b60650d4e","Type":"ContainerStarted","Data":"2773cab54931ecf7bccfe9d41153a69cac15d03ab4f3ed4bf841ccfa218676a2"} Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.272002 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.301069 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" podStartSLOduration=2.983975478 podStartE2EDuration="50.301028442s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" firstStartedPulling="2025-12-04 17:53:20.316751479 +0000 UTC m=+1611.677825881" lastFinishedPulling="2025-12-04 17:54:07.633804413 +0000 UTC m=+1658.994878845" observedRunningTime="2025-12-04 17:54:08.294509865 +0000 UTC m=+1659.655584287" watchObservedRunningTime="2025-12-04 17:54:08.301028442 +0000 UTC m=+1659.662102854" Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.319088 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" podStartSLOduration=1.7933155219999999 podStartE2EDuration="50.319062585s" podCreationTimestamp="2025-12-04 17:53:18 +0000 UTC" 
firstStartedPulling="2025-12-04 17:53:19.451386173 +0000 UTC m=+1610.812460575" lastFinishedPulling="2025-12-04 17:54:07.977133236 +0000 UTC m=+1659.338207638" observedRunningTime="2025-12-04 17:54:08.312506667 +0000 UTC m=+1659.673581069" watchObservedRunningTime="2025-12-04 17:54:08.319062585 +0000 UTC m=+1659.680136997" Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.846412 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-pvwlh" Dec 04 17:54:08 crc kubenswrapper[4948]: I1204 17:54:08.897921 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-5g87l" Dec 04 17:54:18 crc kubenswrapper[4948]: I1204 17:54:18.758790 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-p7mxr" Dec 04 17:54:18 crc kubenswrapper[4948]: I1204 17:54:18.891018 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p5t6c" Dec 04 17:54:19 crc kubenswrapper[4948]: I1204 17:54:19.914292 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:54:19 crc kubenswrapper[4948]: E1204 17:54:19.914694 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:54:33 crc kubenswrapper[4948]: I1204 17:54:33.914519 4948 scope.go:117] "RemoveContainer" 
containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:54:33 crc kubenswrapper[4948]: E1204 17:54:33.915554 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.295655 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dk6wg"] Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.297478 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.299332 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.299532 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.299684 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jnzpn" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.299904 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.317213 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dk6wg"] Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.361565 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jdgf4"] Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.365734 4948 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.366233 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-config\") pod \"dnsmasq-dns-675f4bcbfc-dk6wg\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.366337 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdp5d\" (UniqueName: \"kubernetes.io/projected/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-kube-api-access-fdp5d\") pod \"dnsmasq-dns-675f4bcbfc-dk6wg\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.368174 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.378028 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jdgf4"] Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.466762 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.466830 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdp5d\" (UniqueName: \"kubernetes.io/projected/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-kube-api-access-fdp5d\") pod \"dnsmasq-dns-675f4bcbfc-dk6wg\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.466854 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7pzz\" (UniqueName: \"kubernetes.io/projected/60f1f62a-9f68-48c0-bd54-7b1944a34632-kube-api-access-t7pzz\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.466884 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-config\") pod \"dnsmasq-dns-675f4bcbfc-dk6wg\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.466900 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-config\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.467808 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-config\") pod \"dnsmasq-dns-675f4bcbfc-dk6wg\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.489712 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdp5d\" (UniqueName: \"kubernetes.io/projected/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-kube-api-access-fdp5d\") pod \"dnsmasq-dns-675f4bcbfc-dk6wg\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 
17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.567595 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.567663 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7pzz\" (UniqueName: \"kubernetes.io/projected/60f1f62a-9f68-48c0-bd54-7b1944a34632-kube-api-access-t7pzz\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.567692 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-config\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.568584 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.568617 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-config\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.599703 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-t7pzz\" (UniqueName: \"kubernetes.io/projected/60f1f62a-9f68-48c0-bd54-7b1944a34632-kube-api-access-t7pzz\") pod \"dnsmasq-dns-78dd6ddcc-jdgf4\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.617687 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:54:36 crc kubenswrapper[4948]: I1204 17:54:36.679379 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.020706 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dk6wg"] Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.211791 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jdgf4"] Dec 04 17:54:37 crc kubenswrapper[4948]: W1204 17:54:37.215704 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60f1f62a_9f68_48c0_bd54_7b1944a34632.slice/crio-4507c376795721385040788c4a1dd491de6c4144fced54245c872fab09d78e7a WatchSource:0}: Error finding container 4507c376795721385040788c4a1dd491de6c4144fced54245c872fab09d78e7a: Status 404 returned error can't find the container with id 4507c376795721385040788c4a1dd491de6c4144fced54245c872fab09d78e7a Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.498985 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" event={"ID":"1f417cb3-bc47-4734-a6b0-d888c04e9c8b","Type":"ContainerStarted","Data":"e78f68848a17db1e6cc100599217bc52c9650d4640589623e4b8784dd9af4fbd"} Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.500121 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" 
event={"ID":"60f1f62a-9f68-48c0-bd54-7b1944a34632","Type":"ContainerStarted","Data":"4507c376795721385040788c4a1dd491de6c4144fced54245c872fab09d78e7a"} Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.816993 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dk6wg"] Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.847317 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-llhqq"] Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.848393 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.856641 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-llhqq"] Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.986454 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-config\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.986513 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mht69\" (UniqueName: \"kubernetes.io/projected/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-kube-api-access-mht69\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:37 crc kubenswrapper[4948]: I1204 17:54:37.986545 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-dns-svc\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " 
pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.088088 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-config\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.088154 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mht69\" (UniqueName: \"kubernetes.io/projected/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-kube-api-access-mht69\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.088187 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-dns-svc\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.088910 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-config\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.089366 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-dns-svc\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.110142 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mht69\" (UniqueName: \"kubernetes.io/projected/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-kube-api-access-mht69\") pod \"dnsmasq-dns-666b6646f7-llhqq\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.166540 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.537397 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jdgf4"] Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.580445 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qlssc"] Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.582172 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.597431 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xbtn\" (UniqueName: \"kubernetes.io/projected/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-kube-api-access-2xbtn\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.597531 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.597568 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-config\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.603524 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qlssc"] Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.673893 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-llhqq"] Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.698178 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.698840 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-config\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.698997 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xbtn\" (UniqueName: \"kubernetes.io/projected/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-kube-api-access-2xbtn\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.699701 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: 
\"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.699829 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-config\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.739303 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xbtn\" (UniqueName: \"kubernetes.io/projected/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-kube-api-access-2xbtn\") pod \"dnsmasq-dns-57d769cc4f-qlssc\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:38 crc kubenswrapper[4948]: I1204 17:54:38.915484 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.390560 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.396056 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.399461 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xxq89" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.399788 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.400219 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.401670 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.403574 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.403830 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.403965 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.404116 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.432827 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qlssc"] Dec 04 17:54:39 crc kubenswrapper[4948]: W1204 17:54:39.439647 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef332ce3_f50d_49f9_a786_1d656f9bdf7d.slice/crio-96e9b630c73ad15fcf7209ac9741609b6ceb5d177561b09c9421835174464568 WatchSource:0}: Error finding container 96e9b630c73ad15fcf7209ac9741609b6ceb5d177561b09c9421835174464568: Status 404 returned error 
can't find the container with id 96e9b630c73ad15fcf7209ac9741609b6ceb5d177561b09c9421835174464568 Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.512238 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.512525 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b4baf7-8366-4f47-8515-c33e1b691856-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.512892 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513012 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513049 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b4baf7-8366-4f47-8515-c33e1b691856-pod-info\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") 
" pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513069 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hm6h\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-kube-api-access-4hm6h\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513097 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-server-conf\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513199 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513238 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.513282 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: 
I1204 17:54:39.513313 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.521030 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" event={"ID":"8709b04e-a9d6-4d38-a0e7-dcc4e226be53","Type":"ContainerStarted","Data":"ba8dd40995df63d26861e15f2328ad104f87e91000e15a6dbedbcc7dc5f4a62f"} Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.522424 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" event={"ID":"ef332ce3-f50d-49f9-a786-1d656f9bdf7d","Type":"ContainerStarted","Data":"96e9b630c73ad15fcf7209ac9741609b6ceb5d177561b09c9421835174464568"} Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614231 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b4baf7-8366-4f47-8515-c33e1b691856-pod-info\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614284 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hm6h\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-kube-api-access-4hm6h\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614316 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614340 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614357 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614397 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614413 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614447 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614466 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b4baf7-8366-4f47-8515-c33e1b691856-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614491 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.614509 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.615023 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.615569 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.615686 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.616455 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.616570 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.617962 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-server-conf\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.620620 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.621308 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc 
kubenswrapper[4948]: I1204 17:54:39.621343 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b4baf7-8366-4f47-8515-c33e1b691856-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.621358 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b4baf7-8366-4f47-8515-c33e1b691856-pod-info\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.629195 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hm6h\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-kube-api-access-4hm6h\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.637170 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " pod="openstack/rabbitmq-server-0" Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.692325 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.695438 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.698816 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.699163 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.699316 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ffwb5"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.701233 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.701657 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.702111 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.704605 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.712628 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.727345 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818712 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818763 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818794 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b34ca165-31d6-44fa-b175-ed2b1bf9f766-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818831 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818864 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcf5l\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-kube-api-access-hcf5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818890 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818918 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818951 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b34ca165-31d6-44fa-b175-ed2b1bf9f766-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.818985 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.819017 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.819085 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.920958 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921018 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921073 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921105 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921123 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921145 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b34ca165-31d6-44fa-b175-ed2b1bf9f766-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921176 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921205 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcf5l\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-kube-api-access-hcf5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921229 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921251 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.921283 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b34ca165-31d6-44fa-b175-ed2b1bf9f766-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.922117 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.923544 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.923698 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.923812 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.923909 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.924753 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.926113 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b34ca165-31d6-44fa-b175-ed2b1bf9f766-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.926412 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b34ca165-31d6-44fa-b175-ed2b1bf9f766-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.941889 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.944574 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:39 crc kubenswrapper[4948]: I1204 17:54:39.953231 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcf5l\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-kube-api-access-hcf5l\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:40 crc kubenswrapper[4948]: I1204 17:54:40.015483 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:40 crc kubenswrapper[4948]: I1204 17:54:40.027000 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:54:40 crc kubenswrapper[4948]: I1204 17:54:40.247336 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 04 17:54:40 crc kubenswrapper[4948]: W1204 17:54:40.253421 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b4baf7_8366_4f47_8515_c33e1b691856.slice/crio-7a8effce90210822af5a088861f11228c36d6225a004f984f8933d2fc185cf2d WatchSource:0}: Error finding container 7a8effce90210822af5a088861f11228c36d6225a004f984f8933d2fc185cf2d: Status 404 returned error can't find the container with id 7a8effce90210822af5a088861f11228c36d6225a004f984f8933d2fc185cf2d
Dec 04 17:54:40 crc kubenswrapper[4948]: I1204 17:54:40.532324 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"90b4baf7-8366-4f47-8515-c33e1b691856","Type":"ContainerStarted","Data":"7a8effce90210822af5a088861f11228c36d6225a004f984f8933d2fc185cf2d"}
Dec 04 17:54:40 crc kubenswrapper[4948]: I1204 17:54:40.532505 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 17:54:40 crc kubenswrapper[4948]: W1204 17:54:40.541139 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb34ca165_31d6_44fa_b175_ed2b1bf9f766.slice/crio-620df2ab73d787ca0a2318f901ef4e5e794bd6131973c0c9fb9775dc461704e7 WatchSource:0}: Error finding container 620df2ab73d787ca0a2318f901ef4e5e794bd6131973c0c9fb9775dc461704e7: Status 404 returned error can't find the container with id 620df2ab73d787ca0a2318f901ef4e5e794bd6131973c0c9fb9775dc461704e7
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.219662 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.225475 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.229789 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.230300 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.230431 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.230659 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9d9sk"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.236338 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.238017 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378546 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378715 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378749 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7sv\" (UniqueName: \"kubernetes.io/projected/27244fac-7ff8-4ca0-9002-ef85f78a2564-kube-api-access-gs7sv\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378803 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378832 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378867 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378896 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.378945 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482098 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482210 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482232 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7sv\" (UniqueName: \"kubernetes.io/projected/27244fac-7ff8-4ca0-9002-ef85f78a2564-kube-api-access-gs7sv\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482291 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482314 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482638 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482338 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482866 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.482894 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.484884 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.485871 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.486215 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.490283 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.492433 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.498321 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7sv\" (UniqueName: \"kubernetes.io/projected/27244fac-7ff8-4ca0-9002-ef85f78a2564-kube-api-access-gs7sv\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.514342 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.534327 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " pod="openstack/openstack-galera-0"
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.541493 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b34ca165-31d6-44fa-b175-ed2b1bf9f766","Type":"ContainerStarted","Data":"620df2ab73d787ca0a2318f901ef4e5e794bd6131973c0c9fb9775dc461704e7"}
Dec 04 17:54:41 crc kubenswrapper[4948]: I1204 17:54:41.552158 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.150455 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.553686 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27244fac-7ff8-4ca0-9002-ef85f78a2564","Type":"ContainerStarted","Data":"a8c0af5aae132e2544ef791c098e1d413a5a8ae119ff721087d8a2f7969a5c88"}
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.630555 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.641969 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.646300 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.646935 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.647225 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.647388 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rdz44"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.654432 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.772741 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.774183 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.782202 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-h9md6"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.782901 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.783302 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.788023 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822455 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822507 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822545 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822585 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822610 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822626 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822689 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rc78\" (UniqueName: \"kubernetes.io/projected/10997b06-2476-4c6c-865d-1e5927e75fac-kube-api-access-7rc78\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.822726 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924786 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rc78\" (UniqueName: \"kubernetes.io/projected/10997b06-2476-4c6c-865d-1e5927e75fac-kube-api-access-7rc78\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924832 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-kolla-config\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924855 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n55k7\" (UniqueName: \"kubernetes.io/projected/fce6fe82-2dcb-49cd-851a-446e66038965-kube-api-access-n55k7\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924879 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924913 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924934 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.924966 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.925000 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.925027 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.926333 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.926358 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.926378 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-config-data\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.926403 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.928141 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.928489 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0"
Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.937577 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-default\")
pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.938828 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.939971 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.940351 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.953974 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rc78\" (UniqueName: \"kubernetes.io/projected/10997b06-2476-4c6c-865d-1e5927e75fac-kube-api-access-7rc78\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.961776 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " 
pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:42 crc kubenswrapper[4948]: I1204 17:54:42.988846 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.028132 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-config-data\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.028199 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.028236 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-kolla-config\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.028260 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n55k7\" (UniqueName: \"kubernetes.io/projected/fce6fe82-2dcb-49cd-851a-446e66038965-kube-api-access-n55k7\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.028375 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.029734 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-kolla-config\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.036537 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-combined-ca-bundle\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.048404 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-config-data\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.048882 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n55k7\" (UniqueName: \"kubernetes.io/projected/fce6fe82-2dcb-49cd-851a-446e66038965-kube-api-access-n55k7\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.058740 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs\") pod \"memcached-0\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.105313 4948 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.270877 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 17:54:43 crc kubenswrapper[4948]: I1204 17:54:43.640716 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.423548 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.424735 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.428136 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gntvp" Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.446893 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.552488 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whc4z\" (UniqueName: \"kubernetes.io/projected/8e901b75-2ae5-4a6a-b958-4c924edc4189-kube-api-access-whc4z\") pod \"kube-state-metrics-0\" (UID: \"8e901b75-2ae5-4a6a-b958-4c924edc4189\") " pod="openstack/kube-state-metrics-0" Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.654353 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whc4z\" (UniqueName: \"kubernetes.io/projected/8e901b75-2ae5-4a6a-b958-4c924edc4189-kube-api-access-whc4z\") pod \"kube-state-metrics-0\" (UID: \"8e901b75-2ae5-4a6a-b958-4c924edc4189\") " pod="openstack/kube-state-metrics-0" Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.672816 4948 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-whc4z\" (UniqueName: \"kubernetes.io/projected/8e901b75-2ae5-4a6a-b958-4c924edc4189-kube-api-access-whc4z\") pod \"kube-state-metrics-0\" (UID: \"8e901b75-2ae5-4a6a-b958-4c924edc4189\") " pod="openstack/kube-state-metrics-0" Dec 04 17:54:44 crc kubenswrapper[4948]: I1204 17:54:44.760943 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 17:54:46 crc kubenswrapper[4948]: I1204 17:54:46.914738 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:54:46 crc kubenswrapper[4948]: E1204 17:54:46.915228 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.405959 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bd2ch"] Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.407243 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.413108 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jn97w" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.413161 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.413111 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.430784 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bd2ch"] Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.467955 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rzjh8"] Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.469484 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.477218 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rzjh8"] Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517205 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-scripts\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517442 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517507 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-ovn-controller-tls-certs\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517620 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5p2x\" (UniqueName: \"kubernetes.io/projected/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-kube-api-access-v5p2x\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517674 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-log-ovn\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517800 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run-ovn\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.517855 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-combined-ca-bundle\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619017 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run-ovn\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619128 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-combined-ca-bundle\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619149 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run\") pod \"ovn-controller-bd2ch\" (UID: 
\"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619168 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-scripts\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619191 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-log\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619219 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-scripts\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619244 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-etc-ovs\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619288 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmjx2\" (UniqueName: \"kubernetes.io/projected/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-kube-api-access-pmjx2\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " 
pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619401 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-ovn-controller-tls-certs\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619505 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5p2x\" (UniqueName: \"kubernetes.io/projected/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-kube-api-access-v5p2x\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619537 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-run\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619572 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-log-ovn\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.619630 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-lib\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: 
I1204 17:54:48.620152 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-log-ovn\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.620203 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.620337 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run-ovn\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.621215 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-scripts\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.632775 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-ovn-controller-tls-certs\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.632975 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-combined-ca-bundle\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.637413 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5p2x\" (UniqueName: \"kubernetes.io/projected/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-kube-api-access-v5p2x\") pod \"ovn-controller-bd2ch\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.685831 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.687824 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.691004 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.691371 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-j4v9n" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.691492 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.691537 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.692194 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.702900 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724279 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-log\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724340 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-scripts\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724366 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-etc-ovs\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724395 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmjx2\" (UniqueName: \"kubernetes.io/projected/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-kube-api-access-pmjx2\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724449 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-run\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724479 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-lib\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724557 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-log\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724780 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-lib\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724789 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-etc-ovs\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.724846 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-run\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.726168 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bd2ch" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.731011 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-scripts\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.748294 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmjx2\" (UniqueName: \"kubernetes.io/projected/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-kube-api-access-pmjx2\") pod \"ovn-controller-ovs-rzjh8\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.788205 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.827703 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.827774 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.827817 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.827850 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrv9\" (UniqueName: \"kubernetes.io/projected/0cc3ac35-04df-4516-8623-b6a0d855c98a-kube-api-access-2mrv9\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.827891 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.827946 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.828062 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.828178 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-config\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.929857 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-config\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.929958 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.929998 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.930043 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.930099 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrv9\" (UniqueName: \"kubernetes.io/projected/0cc3ac35-04df-4516-8623-b6a0d855c98a-kube-api-access-2mrv9\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 
17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.930141 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.930168 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.930288 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.930910 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.931223 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.931670 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-config\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.932416 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.934634 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.939111 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.939244 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.949872 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mrv9\" (UniqueName: \"kubernetes.io/projected/0cc3ac35-04df-4516-8623-b6a0d855c98a-kube-api-access-2mrv9\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " 
pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:48 crc kubenswrapper[4948]: I1204 17:54:48.953163 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:49 crc kubenswrapper[4948]: I1204 17:54:49.028114 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.146488 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.148151 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.151085 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.152707 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.152968 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-drzwh" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.153210 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.167721 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.299376 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.299666 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.299866 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.300014 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd7sf\" (UniqueName: \"kubernetes.io/projected/6840a402-94d3-48e6-9ccb-d578573e430a-kube-api-access-sd7sf\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.300167 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.300333 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-config\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " 
pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.300484 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.300651 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402049 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402168 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402222 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402245 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sd7sf\" (UniqueName: \"kubernetes.io/projected/6840a402-94d3-48e6-9ccb-d578573e430a-kube-api-access-sd7sf\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402583 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402694 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402758 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-config\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.403808 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.403923 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.402933 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.404032 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.404443 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.409994 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.413196 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.417196 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.438972 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.442398 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd7sf\" (UniqueName: \"kubernetes.io/projected/6840a402-94d3-48e6-9ccb-d578573e430a-kube-api-access-sd7sf\") pod \"ovsdbserver-sb-0\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:52 crc kubenswrapper[4948]: I1204 17:54:52.475767 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 17:54:58 crc kubenswrapper[4948]: W1204 17:54:58.643816 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfce6fe82_2dcb_49cd_851a_446e66038965.slice/crio-e2d81f333d32bce4bcb749b5b7cbf4de2ffaf7cb31712c52e8436d033d22cddd WatchSource:0}: Error finding container e2d81f333d32bce4bcb749b5b7cbf4de2ffaf7cb31712c52e8436d033d22cddd: Status 404 returned error can't find the container with id e2d81f333d32bce4bcb749b5b7cbf4de2ffaf7cb31712c52e8436d033d22cddd Dec 04 17:54:58 crc kubenswrapper[4948]: I1204 17:54:58.688303 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fce6fe82-2dcb-49cd-851a-446e66038965","Type":"ContainerStarted","Data":"e2d81f333d32bce4bcb749b5b7cbf4de2ffaf7cb31712c52e8436d033d22cddd"} Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.172442 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.172658 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4hm6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(90b4baf7-8366-4f47-8515-c33e1b691856): ErrImagePull: 
rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.173866 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.181083 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.181283 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcf5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(b34ca165-31d6-44fa-b175-ed2b1bf9f766): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:55:00 crc 
kubenswrapper[4948]: E1204 17:55:00.182466 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.704355 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" Dec 04 17:55:00 crc kubenswrapper[4948]: E1204 17:55:00.704576 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" Dec 04 17:55:01 crc kubenswrapper[4948]: I1204 17:55:01.914182 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:55:01 crc kubenswrapper[4948]: E1204 17:55:01.914828 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.179925 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 
17:55:08.276675 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bd2ch"] Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.283242 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.318928 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.655026 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.655616 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdp5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dk6wg_openstack(1f417cb3-bc47-4734-a6b0-d888c04e9c8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.655853 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.655930 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t7pzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:ni
l,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-jdgf4_openstack(60f1f62a-9f68-48c0-bd54-7b1944a34632): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.658109 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" podUID="1f417cb3-bc47-4734-a6b0-d888c04e9c8b" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.658705 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" podUID="60f1f62a-9f68-48c0-bd54-7b1944a34632" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.667123 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.667238 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mht69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-llhqq_openstack(8709b04e-a9d6-4d38-a0e7-dcc4e226be53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.668361 4948 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.715676 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.715861 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xbtn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-qlssc_openstack(ef332ce3-f50d-49f9-a786-1d656f9bdf7d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.717273 4948 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.764233 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e901b75-2ae5-4a6a-b958-4c924edc4189","Type":"ContainerStarted","Data":"2c5f8b3adf7be13cc91b7d0a19176dd2cdc680846db75ed5aa265cf70a20cbc9"} Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.766477 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch" event={"ID":"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1","Type":"ContainerStarted","Data":"80e6d672daf9de3de1ae6621a0dc3ac67417142cd01056d8ce7168d193af2efb"} Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.768416 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"10997b06-2476-4c6c-865d-1e5927e75fac","Type":"ContainerStarted","Data":"2ddc3499f14dc2b4545cd0e2d863bced0a2ebfc46c6eef3fc72c288c305cd155"} Dec 04 17:55:08 crc kubenswrapper[4948]: I1204 17:55:08.769598 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6840a402-94d3-48e6-9ccb-d578573e430a","Type":"ContainerStarted","Data":"fc2ebaecc87fa15fe23f8b47df9c3ae931281d7dcb5e5c98d32e936e8f1b9a17"} Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.773747 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" Dec 04 17:55:08 crc kubenswrapper[4948]: E1204 17:55:08.773958 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.029665 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rzjh8"] Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.173931 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.176605 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.230270 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7pzz\" (UniqueName: \"kubernetes.io/projected/60f1f62a-9f68-48c0-bd54-7b1944a34632-kube-api-access-t7pzz\") pod \"60f1f62a-9f68-48c0-bd54-7b1944a34632\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.230358 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-config\") pod \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.230491 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-config\") pod \"60f1f62a-9f68-48c0-bd54-7b1944a34632\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.230551 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fdp5d\" (UniqueName: \"kubernetes.io/projected/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-kube-api-access-fdp5d\") pod \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\" (UID: \"1f417cb3-bc47-4734-a6b0-d888c04e9c8b\") " Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.230582 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-dns-svc\") pod \"60f1f62a-9f68-48c0-bd54-7b1944a34632\" (UID: \"60f1f62a-9f68-48c0-bd54-7b1944a34632\") " Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.231394 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60f1f62a-9f68-48c0-bd54-7b1944a34632" (UID: "60f1f62a-9f68-48c0-bd54-7b1944a34632"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.232424 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-config" (OuterVolumeSpecName: "config") pod "60f1f62a-9f68-48c0-bd54-7b1944a34632" (UID: "60f1f62a-9f68-48c0-bd54-7b1944a34632"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.232483 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-config" (OuterVolumeSpecName: "config") pod "1f417cb3-bc47-4734-a6b0-d888c04e9c8b" (UID: "1f417cb3-bc47-4734-a6b0-d888c04e9c8b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.236617 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f1f62a-9f68-48c0-bd54-7b1944a34632-kube-api-access-t7pzz" (OuterVolumeSpecName: "kube-api-access-t7pzz") pod "60f1f62a-9f68-48c0-bd54-7b1944a34632" (UID: "60f1f62a-9f68-48c0-bd54-7b1944a34632"). InnerVolumeSpecName "kube-api-access-t7pzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.237965 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-kube-api-access-fdp5d" (OuterVolumeSpecName: "kube-api-access-fdp5d") pod "1f417cb3-bc47-4734-a6b0-d888c04e9c8b" (UID: "1f417cb3-bc47-4734-a6b0-d888c04e9c8b"). InnerVolumeSpecName "kube-api-access-fdp5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.279179 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.331936 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7pzz\" (UniqueName: \"kubernetes.io/projected/60f1f62a-9f68-48c0-bd54-7b1944a34632-kube-api-access-t7pzz\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.331975 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.331990 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.332000 4948 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdp5d\" (UniqueName: \"kubernetes.io/projected/1f417cb3-bc47-4734-a6b0-d888c04e9c8b-kube-api-access-fdp5d\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.332008 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60f1f62a-9f68-48c0-bd54-7b1944a34632-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:09 crc kubenswrapper[4948]: W1204 17:55:09.382204 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc3ac35_04df_4516_8623_b6a0d855c98a.slice/crio-74bd34b58b9a2c2f329ec31a6499e5e08fd2b9554eb628f3b488116ce2c47d7d WatchSource:0}: Error finding container 74bd34b58b9a2c2f329ec31a6499e5e08fd2b9554eb628f3b488116ce2c47d7d: Status 404 returned error can't find the container with id 74bd34b58b9a2c2f329ec31a6499e5e08fd2b9554eb628f3b488116ce2c47d7d Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.779899 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27244fac-7ff8-4ca0-9002-ef85f78a2564","Type":"ContainerStarted","Data":"5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.782985 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerStarted","Data":"ad1e99440685698ea5d9356335de4b56828cb0bac1f1fd589e958d7c34c024d5"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.785104 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fce6fe82-2dcb-49cd-851a-446e66038965","Type":"ContainerStarted","Data":"ca9cbacbe11dab0d67e4befee314eff5c109090f17e6413c4b39e087ea4c3f46"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.785269 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.786267 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" event={"ID":"60f1f62a-9f68-48c0-bd54-7b1944a34632","Type":"ContainerDied","Data":"4507c376795721385040788c4a1dd491de6c4144fced54245c872fab09d78e7a"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.786337 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-jdgf4" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.787126 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" event={"ID":"1f417cb3-bc47-4734-a6b0-d888c04e9c8b","Type":"ContainerDied","Data":"e78f68848a17db1e6cc100599217bc52c9650d4640589623e4b8784dd9af4fbd"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.787174 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dk6wg" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.797946 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cc3ac35-04df-4516-8623-b6a0d855c98a","Type":"ContainerStarted","Data":"74bd34b58b9a2c2f329ec31a6499e5e08fd2b9554eb628f3b488116ce2c47d7d"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.809031 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"10997b06-2476-4c6c-865d-1e5927e75fac","Type":"ContainerStarted","Data":"e8ee824e9c19c8047c179bbf5ecf993388e37bf34793e8731035445c781ff3b5"} Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.826915 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.853150883 podStartE2EDuration="27.826892892s" podCreationTimestamp="2025-12-04 17:54:42 +0000 UTC" firstStartedPulling="2025-12-04 17:54:58.648330279 +0000 UTC m=+1710.009404711" lastFinishedPulling="2025-12-04 17:55:08.622072278 +0000 UTC m=+1719.983146720" observedRunningTime="2025-12-04 17:55:09.821873132 +0000 UTC m=+1721.182947544" watchObservedRunningTime="2025-12-04 17:55:09.826892892 +0000 UTC m=+1721.187967294" Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.882385 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jdgf4"] Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.887577 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-jdgf4"] Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.920684 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dk6wg"] Dec 04 17:55:09 crc kubenswrapper[4948]: I1204 17:55:09.928621 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dk6wg"] Dec 04 17:55:10 crc kubenswrapper[4948]: I1204 
17:55:10.923470 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f417cb3-bc47-4734-a6b0-d888c04e9c8b" path="/var/lib/kubelet/pods/1f417cb3-bc47-4734-a6b0-d888c04e9c8b/volumes" Dec 04 17:55:10 crc kubenswrapper[4948]: I1204 17:55:10.925231 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f1f62a-9f68-48c0-bd54-7b1944a34632" path="/var/lib/kubelet/pods/60f1f62a-9f68-48c0-bd54-7b1944a34632/volumes" Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.849612 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e901b75-2ae5-4a6a-b958-4c924edc4189","Type":"ContainerStarted","Data":"4ac93f5a77955edb05b13ac37f9560428bc365628cca00e1159d8b0b5f24352d"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.850333 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.851787 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch" event={"ID":"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1","Type":"ContainerStarted","Data":"fa492590ec2bee6d8477d4e723bea4207fb5e920790c621ae08dfef6fec716cc"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.851830 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-bd2ch" Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.853935 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cc3ac35-04df-4516-8623-b6a0d855c98a","Type":"ContainerStarted","Data":"0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.856169 4948 generic.go:334] "Generic (PLEG): container finished" podID="10997b06-2476-4c6c-865d-1e5927e75fac" containerID="e8ee824e9c19c8047c179bbf5ecf993388e37bf34793e8731035445c781ff3b5" exitCode=0 Dec 04 17:55:14 crc 
kubenswrapper[4948]: I1204 17:55:14.856285 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"10997b06-2476-4c6c-865d-1e5927e75fac","Type":"ContainerDied","Data":"e8ee824e9c19c8047c179bbf5ecf993388e37bf34793e8731035445c781ff3b5"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.857597 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6840a402-94d3-48e6-9ccb-d578573e430a","Type":"ContainerStarted","Data":"568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.858804 4948 generic.go:334] "Generic (PLEG): container finished" podID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerID="5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db" exitCode=0 Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.858847 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27244fac-7ff8-4ca0-9002-ef85f78a2564","Type":"ContainerDied","Data":"5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.860239 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerStarted","Data":"d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02"} Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.869336 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=25.445976234 podStartE2EDuration="30.869312721s" podCreationTimestamp="2025-12-04 17:54:44 +0000 UTC" firstStartedPulling="2025-12-04 17:55:08.626608322 +0000 UTC m=+1719.987682724" lastFinishedPulling="2025-12-04 17:55:14.049944809 +0000 UTC m=+1725.411019211" observedRunningTime="2025-12-04 17:55:14.866567964 +0000 UTC m=+1726.227642396" 
watchObservedRunningTime="2025-12-04 17:55:14.869312721 +0000 UTC m=+1726.230387123" Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.913780 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:55:14 crc kubenswrapper[4948]: E1204 17:55:14.914228 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:55:14 crc kubenswrapper[4948]: I1204 17:55:14.949701 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-bd2ch" podStartSLOduration=21.545308478 podStartE2EDuration="26.949684233s" podCreationTimestamp="2025-12-04 17:54:48 +0000 UTC" firstStartedPulling="2025-12-04 17:55:08.632874261 +0000 UTC m=+1719.993948663" lastFinishedPulling="2025-12-04 17:55:14.037250006 +0000 UTC m=+1725.398324418" observedRunningTime="2025-12-04 17:55:14.945823021 +0000 UTC m=+1726.306897423" watchObservedRunningTime="2025-12-04 17:55:14.949684233 +0000 UTC m=+1726.310758635" Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.868087 4948 generic.go:334] "Generic (PLEG): container finished" podID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerID="d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02" exitCode=0 Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.868144 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerDied","Data":"d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02"} Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 
17:55:15.869887 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"90b4baf7-8366-4f47-8515-c33e1b691856","Type":"ContainerStarted","Data":"82c901cf00202ab9ecd08dc4c09ede1d9fcdc9869bb58784238db83a9b10208f"} Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.878622 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b34ca165-31d6-44fa-b175-ed2b1bf9f766","Type":"ContainerStarted","Data":"0faca2eff6b0bcf6f0f9c1e986baf52aab23458cefa4976735633696f679414d"} Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.884554 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"10997b06-2476-4c6c-865d-1e5927e75fac","Type":"ContainerStarted","Data":"c6364a91b688011f494239085545a963704364e17364c1672c50b66b56b55484"} Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.891091 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27244fac-7ff8-4ca0-9002-ef85f78a2564","Type":"ContainerStarted","Data":"6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163"} Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.958191 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=34.958172971 podStartE2EDuration="34.958172971s" podCreationTimestamp="2025-12-04 17:54:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:15.95690637 +0000 UTC m=+1727.317980772" watchObservedRunningTime="2025-12-04 17:55:15.958172971 +0000 UTC m=+1727.319247373" Dec 04 17:55:15 crc kubenswrapper[4948]: I1204 17:55:15.980307 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.568293641 podStartE2EDuration="35.980290303s" 
podCreationTimestamp="2025-12-04 17:54:40 +0000 UTC" firstStartedPulling="2025-12-04 17:54:42.209373754 +0000 UTC m=+1693.570448156" lastFinishedPulling="2025-12-04 17:55:08.621370406 +0000 UTC m=+1719.982444818" observedRunningTime="2025-12-04 17:55:15.978330411 +0000 UTC m=+1727.339404813" watchObservedRunningTime="2025-12-04 17:55:15.980290303 +0000 UTC m=+1727.341364705" Dec 04 17:55:16 crc kubenswrapper[4948]: I1204 17:55:16.901641 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerStarted","Data":"8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0"} Dec 04 17:55:18 crc kubenswrapper[4948]: I1204 17:55:18.108141 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 17:55:19 crc kubenswrapper[4948]: I1204 17:55:19.927734 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cc3ac35-04df-4516-8623-b6a0d855c98a","Type":"ContainerStarted","Data":"f9414a835d8c13eaf98608d7189bcb0337dfc8008b5995dd6461769594b0b04b"} Dec 04 17:55:19 crc kubenswrapper[4948]: I1204 17:55:19.931475 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6840a402-94d3-48e6-9ccb-d578573e430a","Type":"ContainerStarted","Data":"055149cc29d5a8a0fbd7d07c17f45bd048d5a2c834bca9a88e2e890d03daf20f"} Dec 04 17:55:19 crc kubenswrapper[4948]: I1204 17:55:19.935630 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerStarted","Data":"8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d"} Dec 04 17:55:19 crc kubenswrapper[4948]: I1204 17:55:19.936477 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:55:19 crc kubenswrapper[4948]: I1204 
17:55:19.936512 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:55:19 crc kubenswrapper[4948]: I1204 17:55:19.987031 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=23.687362178 podStartE2EDuration="32.98700963s" podCreationTimestamp="2025-12-04 17:54:47 +0000 UTC" firstStartedPulling="2025-12-04 17:55:09.384560044 +0000 UTC m=+1720.745634446" lastFinishedPulling="2025-12-04 17:55:18.684207496 +0000 UTC m=+1730.045281898" observedRunningTime="2025-12-04 17:55:19.968614346 +0000 UTC m=+1731.329688768" watchObservedRunningTime="2025-12-04 17:55:19.98700963 +0000 UTC m=+1731.348084032" Dec 04 17:55:20 crc kubenswrapper[4948]: I1204 17:55:20.040133 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.968293381 podStartE2EDuration="29.040114016s" podCreationTimestamp="2025-12-04 17:54:51 +0000 UTC" firstStartedPulling="2025-12-04 17:55:08.627346676 +0000 UTC m=+1719.988421078" lastFinishedPulling="2025-12-04 17:55:18.699167311 +0000 UTC m=+1730.060241713" observedRunningTime="2025-12-04 17:55:20.036344277 +0000 UTC m=+1731.397418689" watchObservedRunningTime="2025-12-04 17:55:20.040114016 +0000 UTC m=+1731.401188428" Dec 04 17:55:20 crc kubenswrapper[4948]: I1204 17:55:20.061710 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rzjh8" podStartSLOduration=27.076173 podStartE2EDuration="32.061685752s" podCreationTimestamp="2025-12-04 17:54:48 +0000 UTC" firstStartedPulling="2025-12-04 17:55:09.022192645 +0000 UTC m=+1720.383267047" lastFinishedPulling="2025-12-04 17:55:14.007705397 +0000 UTC m=+1725.368779799" observedRunningTime="2025-12-04 17:55:20.056328451 +0000 UTC m=+1731.417402863" watchObservedRunningTime="2025-12-04 17:55:20.061685752 +0000 UTC m=+1731.422760154" Dec 04 17:55:20 crc 
kubenswrapper[4948]: I1204 17:55:20.947259 4948 generic.go:334] "Generic (PLEG): container finished" podID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerID="46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b" exitCode=0 Dec 04 17:55:20 crc kubenswrapper[4948]: I1204 17:55:20.947394 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" event={"ID":"8709b04e-a9d6-4d38-a0e7-dcc4e226be53","Type":"ContainerDied","Data":"46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b"} Dec 04 17:55:20 crc kubenswrapper[4948]: I1204 17:55:20.951023 4948 generic.go:334] "Generic (PLEG): container finished" podID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerID="635558bb8ecf4e918d92eecb52fb13d2a7f30e667c36d9f170a301a2e8fd8e29" exitCode=0 Dec 04 17:55:20 crc kubenswrapper[4948]: I1204 17:55:20.953082 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" event={"ID":"ef332ce3-f50d-49f9-a786-1d656f9bdf7d","Type":"ContainerDied","Data":"635558bb8ecf4e918d92eecb52fb13d2a7f30e667c36d9f170a301a2e8fd8e29"} Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.553008 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.553095 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.681619 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.965502 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" event={"ID":"8709b04e-a9d6-4d38-a0e7-dcc4e226be53","Type":"ContainerStarted","Data":"b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369"} Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 
17:55:21.966988 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.971745 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" event={"ID":"ef332ce3-f50d-49f9-a786-1d656f9bdf7d","Type":"ContainerStarted","Data":"389f145562c17d1ec561b673f95cabfb77fc00d9e78f3eedac4f6276a1f0c890"} Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.985320 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" podStartSLOduration=3.309743126 podStartE2EDuration="44.985290572s" podCreationTimestamp="2025-12-04 17:54:37 +0000 UTC" firstStartedPulling="2025-12-04 17:54:38.671185048 +0000 UTC m=+1690.032259450" lastFinishedPulling="2025-12-04 17:55:20.346732454 +0000 UTC m=+1731.707806896" observedRunningTime="2025-12-04 17:55:21.980269463 +0000 UTC m=+1733.341343865" watchObservedRunningTime="2025-12-04 17:55:21.985290572 +0000 UTC m=+1733.346364994" Dec 04 17:55:21 crc kubenswrapper[4948]: I1204 17:55:21.998106 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" podStartSLOduration=3.094344504 podStartE2EDuration="43.998089999s" podCreationTimestamp="2025-12-04 17:54:38 +0000 UTC" firstStartedPulling="2025-12-04 17:54:39.441640796 +0000 UTC m=+1690.802715198" lastFinishedPulling="2025-12-04 17:55:20.345386291 +0000 UTC m=+1731.706460693" observedRunningTime="2025-12-04 17:55:21.994771773 +0000 UTC m=+1733.355846185" watchObservedRunningTime="2025-12-04 17:55:21.998089999 +0000 UTC m=+1733.359164401" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.029332 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.039419 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/openstack-galera-0" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.077259 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.476307 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.476393 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.543033 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.768567 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c83f-account-create-update-kx2n6"] Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.770316 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.773020 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.776965 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c83f-account-create-update-kx2n6"] Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.813105 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-t8qtr"] Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.814193 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.820787 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-t8qtr"] Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.929665 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0913a9-d96e-404a-9ece-85dc07caad20-operator-scripts\") pod \"keystone-db-create-t8qtr\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.929728 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fq9d\" (UniqueName: \"kubernetes.io/projected/ee248375-d52b-46cc-bef6-c6a53f95537e-kube-api-access-4fq9d\") pod \"keystone-c83f-account-create-update-kx2n6\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.929753 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgcb\" (UniqueName: \"kubernetes.io/projected/8d0913a9-d96e-404a-9ece-85dc07caad20-kube-api-access-jqgcb\") pod \"keystone-db-create-t8qtr\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.929985 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee248375-d52b-46cc-bef6-c6a53f95537e-operator-scripts\") pod \"keystone-c83f-account-create-update-kx2n6\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:22 crc kubenswrapper[4948]: I1204 17:55:22.979950 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.031335 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-c8pjq"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.032706 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.037814 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.040530 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgcb\" (UniqueName: \"kubernetes.io/projected/8d0913a9-d96e-404a-9ece-85dc07caad20-kube-api-access-jqgcb\") pod \"keystone-db-create-t8qtr\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.040651 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee248375-d52b-46cc-bef6-c6a53f95537e-operator-scripts\") pod \"keystone-c83f-account-create-update-kx2n6\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.040919 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0913a9-d96e-404a-9ece-85dc07caad20-operator-scripts\") pod \"keystone-db-create-t8qtr\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.041015 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fq9d\" (UniqueName: 
\"kubernetes.io/projected/ee248375-d52b-46cc-bef6-c6a53f95537e-kube-api-access-4fq9d\") pod \"keystone-c83f-account-create-update-kx2n6\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.041808 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c8pjq"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.042163 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee248375-d52b-46cc-bef6-c6a53f95537e-operator-scripts\") pod \"keystone-c83f-account-create-update-kx2n6\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.042721 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0913a9-d96e-404a-9ece-85dc07caad20-operator-scripts\") pod \"keystone-db-create-t8qtr\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.056318 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.065162 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fq9d\" (UniqueName: \"kubernetes.io/projected/ee248375-d52b-46cc-bef6-c6a53f95537e-kube-api-access-4fq9d\") pod \"keystone-c83f-account-create-update-kx2n6\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.067195 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgcb\" (UniqueName: 
\"kubernetes.io/projected/8d0913a9-d96e-404a-9ece-85dc07caad20-kube-api-access-jqgcb\") pod \"keystone-db-create-t8qtr\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.094735 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.134651 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.139471 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7046-account-create-update-9zvv4"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.141071 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.142675 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5tz\" (UniqueName: \"kubernetes.io/projected/42c87540-53a6-4923-adcb-3af20aa678d1-kube-api-access-kq5tz\") pod \"placement-db-create-c8pjq\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.143078 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42c87540-53a6-4923-adcb-3af20aa678d1-operator-scripts\") pod \"placement-db-create-c8pjq\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.147847 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 
17:55:23.210383 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7046-account-create-update-9zvv4"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.244912 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b875eb-4f81-407e-b0f1-12086316a557-operator-scripts\") pod \"placement-7046-account-create-update-9zvv4\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.245217 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5tz\" (UniqueName: \"kubernetes.io/projected/42c87540-53a6-4923-adcb-3af20aa678d1-kube-api-access-kq5tz\") pod \"placement-db-create-c8pjq\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.245337 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6vg\" (UniqueName: \"kubernetes.io/projected/b6b875eb-4f81-407e-b0f1-12086316a557-kube-api-access-rm6vg\") pod \"placement-7046-account-create-update-9zvv4\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.245375 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42c87540-53a6-4923-adcb-3af20aa678d1-operator-scripts\") pod \"placement-db-create-c8pjq\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.246404 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/42c87540-53a6-4923-adcb-3af20aa678d1-operator-scripts\") pod \"placement-db-create-c8pjq\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.272424 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.272467 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.273698 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5tz\" (UniqueName: \"kubernetes.io/projected/42c87540-53a6-4923-adcb-3af20aa678d1-kube-api-access-kq5tz\") pod \"placement-db-create-c8pjq\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.280994 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-llhqq"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.298598 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kgd58"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.299928 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.303780 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.316135 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-jz2bh"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.317335 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.320595 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.321681 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kgd58"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.338784 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-jz2bh"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.349939 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6vg\" (UniqueName: \"kubernetes.io/projected/b6b875eb-4f81-407e-b0f1-12086316a557-kube-api-access-rm6vg\") pod \"placement-7046-account-create-update-9zvv4\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.350174 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b875eb-4f81-407e-b0f1-12086316a557-operator-scripts\") pod \"placement-7046-account-create-update-9zvv4\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.350858 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b875eb-4f81-407e-b0f1-12086316a557-operator-scripts\") pod \"placement-7046-account-create-update-9zvv4\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.362772 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.390245 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6vg\" (UniqueName: \"kubernetes.io/projected/b6b875eb-4f81-407e-b0f1-12086316a557-kube-api-access-rm6vg\") pod \"placement-7046-account-create-update-9zvv4\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.414651 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.440113 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.441476 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.442897 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.443183 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.443358 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7pqq4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.443533 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.454401 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qlssc"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.454655 4948 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerName="dnsmasq-dns" containerID="cri-o://389f145562c17d1ec561b673f95cabfb77fc00d9e78f3eedac4f6276a1f0c890" gracePeriod=10 Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456452 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456479 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hzjg\" (UniqueName: \"kubernetes.io/projected/64ae0228-b131-4cec-a52f-b5786c22355c-kube-api-access-4hzjg\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456497 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovs-rundir\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456519 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcdv5\" (UniqueName: \"kubernetes.io/projected/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-kube-api-access-qcdv5\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456542 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456613 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovn-rundir\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456633 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-combined-ca-bundle\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456687 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456704 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-config\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.456728 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ae0228-b131-4cec-a52f-b5786c22355c-config\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.459494 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.473889 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.526460 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-zxqkx"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.526853 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.527754 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.532973 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.533076 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zxqkx"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558503 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovn-rundir\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558745 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-combined-ca-bundle\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558774 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558819 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 
17:55:23.558837 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-config\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558868 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ae0228-b131-4cec-a52f-b5786c22355c-config\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558886 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558904 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558919 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558962 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558980 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hzjg\" (UniqueName: \"kubernetes.io/projected/64ae0228-b131-4cec-a52f-b5786c22355c-kube-api-access-4hzjg\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.558997 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovs-rundir\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.559016 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8sn7\" (UniqueName: \"kubernetes.io/projected/b6b365e8-6c2a-41fe-b50a-1702144d67d4-kube-api-access-k8sn7\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.559033 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcdv5\" (UniqueName: \"kubernetes.io/projected/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-kube-api-access-qcdv5\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.559074 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.559095 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.559115 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.559378 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovn-rundir\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.562398 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovs-rundir\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.564961 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: 
\"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.565698 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-config\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.565851 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ae0228-b131-4cec-a52f-b5786c22355c-config\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.566349 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.567471 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-combined-ca-bundle\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.573030 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc 
kubenswrapper[4948]: I1204 17:55:23.580317 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcdv5\" (UniqueName: \"kubernetes.io/projected/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-kube-api-access-qcdv5\") pod \"dnsmasq-dns-6bc7876d45-jz2bh\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.583871 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hzjg\" (UniqueName: \"kubernetes.io/projected/64ae0228-b131-4cec-a52f-b5786c22355c-kube-api-access-4hzjg\") pod \"ovn-controller-metrics-kgd58\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.644233 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kgd58" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.653671 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c83f-account-create-update-kx2n6"] Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661457 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8sn7\" (UniqueName: \"kubernetes.io/projected/b6b365e8-6c2a-41fe-b50a-1702144d67d4-kube-api-access-k8sn7\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661511 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661542 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-dns-svc\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661562 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661613 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv6sj\" (UniqueName: \"kubernetes.io/projected/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-kube-api-access-qv6sj\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661647 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661680 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661727 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-config\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661759 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661786 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661807 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.661828 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.662616 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " 
pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.663105 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.663881 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.669610 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.669955 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.672978 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.676057 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc 
kubenswrapper[4948]: I1204 17:55:23.678450 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8sn7\" (UniqueName: \"kubernetes.io/projected/b6b365e8-6c2a-41fe-b50a-1702144d67d4-kube-api-access-k8sn7\") pod \"ovn-northd-0\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " pod="openstack/ovn-northd-0" Dec 04 17:55:23 crc kubenswrapper[4948]: W1204 17:55:23.679421 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee248375_d52b_46cc_bef6_c6a53f95537e.slice/crio-1de17f5fd9af2c10b9ac44e40dde03e1bb48c1c766ef94fe7947c046e8536a7d WatchSource:0}: Error finding container 1de17f5fd9af2c10b9ac44e40dde03e1bb48c1c766ef94fe7947c046e8536a7d: Status 404 returned error can't find the container with id 1de17f5fd9af2c10b9ac44e40dde03e1bb48c1c766ef94fe7947c046e8536a7d Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.763477 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-dns-svc\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.763553 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6sj\" (UniqueName: \"kubernetes.io/projected/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-kube-api-access-qv6sj\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.763605 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: 
\"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.763641 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-config\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.763683 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.764796 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-dns-svc\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.764956 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.765172 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc 
kubenswrapper[4948]: I1204 17:55:23.765348 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-config\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.777431 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv6sj\" (UniqueName: \"kubernetes.io/projected/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-kube-api-access-qv6sj\") pod \"dnsmasq-dns-8554648995-zxqkx\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") " pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:23 crc kubenswrapper[4948]: I1204 17:55:23.782598 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.085186 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zxqkx" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.172943 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-t8qtr"] Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.173005 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.177387 4948 generic.go:334] "Generic (PLEG): container finished" podID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerID="389f145562c17d1ec561b673f95cabfb77fc00d9e78f3eedac4f6276a1f0c890" exitCode=0 Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.177688 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" event={"ID":"ef332ce3-f50d-49f9-a786-1d656f9bdf7d","Type":"ContainerDied","Data":"389f145562c17d1ec561b673f95cabfb77fc00d9e78f3eedac4f6276a1f0c890"} Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.191301 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-jz2bh"] Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.213068 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c83f-account-create-update-kx2n6" event={"ID":"ee248375-d52b-46cc-bef6-c6a53f95537e","Type":"ContainerStarted","Data":"1de17f5fd9af2c10b9ac44e40dde03e1bb48c1c766ef94fe7947c046e8536a7d"} Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.213269 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerName="dnsmasq-dns" containerID="cri-o://b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369" gracePeriod=10 Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.223252 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rp5mc"] Dec 04 17:55:25 
crc kubenswrapper[4948]: I1204 17:55:25.227254 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.241482 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rp5mc"] Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.260408 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-c8pjq"] Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.357813 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7046-account-create-update-9zvv4"] Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.381685 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxm9\" (UniqueName: \"kubernetes.io/projected/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-kube-api-access-5bxm9\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.381856 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.381888 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.381945 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-config\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.382023 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.437658 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.483243 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxm9\" (UniqueName: \"kubernetes.io/projected/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-kube-api-access-5bxm9\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.483367 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.483389 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.483436 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-config\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.483472 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.484996 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.485909 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.486404 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.486868 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-config\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.510180 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxm9\" (UniqueName: \"kubernetes.io/projected/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-kube-api-access-5bxm9\") pod \"dnsmasq-dns-b8fbc5445-rp5mc\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.554413 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kgd58"] Dec 04 17:55:25 crc kubenswrapper[4948]: E1204 17:55:25.583430 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8709b04e_a9d6_4d38_a0e7_dcc4e226be53.slice/crio-conmon-b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8709b04e_a9d6_4d38_a0e7_dcc4e226be53.slice/crio-b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369.scope\": RecentStats: unable to find data in memory cache]" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.620689 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.627161 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.688358 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-config\") pod \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.688735 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-dns-svc\") pod \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.688773 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xbtn\" (UniqueName: \"kubernetes.io/projected/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-kube-api-access-2xbtn\") pod \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\" (UID: \"ef332ce3-f50d-49f9-a786-1d656f9bdf7d\") " Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.717295 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-kube-api-access-2xbtn" (OuterVolumeSpecName: "kube-api-access-2xbtn") pod "ef332ce3-f50d-49f9-a786-1d656f9bdf7d" (UID: "ef332ce3-f50d-49f9-a786-1d656f9bdf7d"). InnerVolumeSpecName "kube-api-access-2xbtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.724688 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.728990 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-jz2bh"] Dec 04 17:55:25 crc kubenswrapper[4948]: W1204 17:55:25.761011 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aafa5e3_5239_4ac9_a3d8_a2c11ec411fd.slice/crio-ea76fa0757dcf2755fa052e215aca0731d3df4331bbf5b229935b7358c677947 WatchSource:0}: Error finding container ea76fa0757dcf2755fa052e215aca0731d3df4331bbf5b229935b7358c677947: Status 404 returned error can't find the container with id ea76fa0757dcf2755fa052e215aca0731d3df4331bbf5b229935b7358c677947 Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.790272 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mht69\" (UniqueName: \"kubernetes.io/projected/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-kube-api-access-mht69\") pod \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.790430 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-dns-svc\") pod \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.790519 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-config\") pod \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\" (UID: \"8709b04e-a9d6-4d38-a0e7-dcc4e226be53\") " Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.791229 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xbtn\" (UniqueName: 
\"kubernetes.io/projected/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-kube-api-access-2xbtn\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.793190 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-kube-api-access-mht69" (OuterVolumeSpecName: "kube-api-access-mht69") pod "8709b04e-a9d6-4d38-a0e7-dcc4e226be53" (UID: "8709b04e-a9d6-4d38-a0e7-dcc4e226be53"). InnerVolumeSpecName "kube-api-access-mht69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.798444 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-config" (OuterVolumeSpecName: "config") pod "ef332ce3-f50d-49f9-a786-1d656f9bdf7d" (UID: "ef332ce3-f50d-49f9-a786-1d656f9bdf7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.814922 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef332ce3-f50d-49f9-a786-1d656f9bdf7d" (UID: "ef332ce3-f50d-49f9-a786-1d656f9bdf7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.833271 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8709b04e-a9d6-4d38-a0e7-dcc4e226be53" (UID: "8709b04e-a9d6-4d38-a0e7-dcc4e226be53"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.841666 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-config" (OuterVolumeSpecName: "config") pod "8709b04e-a9d6-4d38-a0e7-dcc4e226be53" (UID: "8709b04e-a9d6-4d38-a0e7-dcc4e226be53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.870135 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zxqkx"] Dec 04 17:55:25 crc kubenswrapper[4948]: W1204 17:55:25.881800 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5536cf0a_e6a0_4932_88ae_6fb9f6dfbb30.slice/crio-1c3f155cb50187b8d908383b7c6c967587f91315bc926aecfe8039e5e6ae880a WatchSource:0}: Error finding container 1c3f155cb50187b8d908383b7c6c967587f91315bc926aecfe8039e5e6ae880a: Status 404 returned error can't find the container with id 1c3f155cb50187b8d908383b7c6c967587f91315bc926aecfe8039e5e6ae880a Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.883268 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.893681 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.893726 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.893739 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mht69\" (UniqueName: 
\"kubernetes.io/projected/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-kube-api-access-mht69\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.893750 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8709b04e-a9d6-4d38-a0e7-dcc4e226be53-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:25 crc kubenswrapper[4948]: I1204 17:55:25.893760 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef332ce3-f50d-49f9-a786-1d656f9bdf7d-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.137310 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.148905 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerName="dnsmasq-dns" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.148922 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerName="dnsmasq-dns" Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.148935 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerName="dnsmasq-dns" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.148941 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerName="dnsmasq-dns" Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.148953 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerName="init" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.148959 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerName="init" Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.148970 4948 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerName="init" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.148975 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerName="init" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.149138 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" containerName="dnsmasq-dns" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.149157 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerName="dnsmasq-dns" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.154252 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.163301 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.163487 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.163564 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-bgctv" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.163663 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.178987 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.223480 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rp5mc"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.258138 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-kgd58" event={"ID":"64ae0228-b131-4cec-a52f-b5786c22355c","Type":"ContainerStarted","Data":"5111ccd42a2dcb9a24627bf842d9a0b851e3ae53f8f5be34d0dd24d8c4061014"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.258181 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kgd58" event={"ID":"64ae0228-b131-4cec-a52f-b5786c22355c","Type":"ContainerStarted","Data":"4f7838c6613c72321ee3bd3a66ad70afa336dcd4e741b7c4ca175872588a253b"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.262169 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" event={"ID":"ef332ce3-f50d-49f9-a786-1d656f9bdf7d","Type":"ContainerDied","Data":"96e9b630c73ad15fcf7209ac9741609b6ceb5d177561b09c9421835174464568"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.262191 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-qlssc" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.262204 4948 scope.go:117] "RemoveContainer" containerID="389f145562c17d1ec561b673f95cabfb77fc00d9e78f3eedac4f6276a1f0c890" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.265705 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7046-account-create-update-9zvv4" event={"ID":"b6b875eb-4f81-407e-b0f1-12086316a557","Type":"ContainerStarted","Data":"afadd5bc8b50ff866da0f039ab345ad38c988c0f86ccd90c03589dbd3fca1a90"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.265730 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7046-account-create-update-9zvv4" event={"ID":"b6b875eb-4f81-407e-b0f1-12086316a557","Type":"ContainerStarted","Data":"81b07549d5ca0f26e7a47aedbcccbaff78433783a44b9ba404c9b4fa2a25ae7b"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.285363 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"b6b365e8-6c2a-41fe-b50a-1702144d67d4","Type":"ContainerStarted","Data":"de06dd9e6d7c027a8d179be989286ee149c6150e569ba7775bf42ec93e14bab0"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.300240 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kgd58" podStartSLOduration=3.300223468 podStartE2EDuration="3.300223468s" podCreationTimestamp="2025-12-04 17:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:26.285678446 +0000 UTC m=+1737.646752838" watchObservedRunningTime="2025-12-04 17:55:26.300223468 +0000 UTC m=+1737.661297870" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.304652 4948 generic.go:334] "Generic (PLEG): container finished" podID="9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" containerID="845491e5d99ac34f52d4280ef7562a97c7b50c24d13195d5224b352388f2d395" exitCode=0 Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.304725 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" event={"ID":"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd","Type":"ContainerDied","Data":"845491e5d99ac34f52d4280ef7562a97c7b50c24d13195d5224b352388f2d395"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.304751 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" event={"ID":"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd","Type":"ContainerStarted","Data":"ea76fa0757dcf2755fa052e215aca0731d3df4331bbf5b229935b7358c677947"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.337664 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c83f-account-create-update-kx2n6" event={"ID":"ee248375-d52b-46cc-bef6-c6a53f95537e","Type":"ContainerStarted","Data":"13f1ec3c600161183c7b13c25fb8ff3c4a268954e2d40bac4b5524f004c61111"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.343647 
4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.343728 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.343756 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567f8\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-kube-api-access-567f8\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.343786 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-cache\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.343837 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-lock\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.359992 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7046-account-create-update-9zvv4" 
podStartSLOduration=3.359970705 podStartE2EDuration="3.359970705s" podCreationTimestamp="2025-12-04 17:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:26.336594043 +0000 UTC m=+1737.697668445" watchObservedRunningTime="2025-12-04 17:55:26.359970705 +0000 UTC m=+1737.721045117" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.361878 4948 scope.go:117] "RemoveContainer" containerID="635558bb8ecf4e918d92eecb52fb13d2a7f30e667c36d9f170a301a2e8fd8e29" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.387672 4948 generic.go:334] "Generic (PLEG): container finished" podID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" containerID="b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369" exitCode=0 Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.387737 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" event={"ID":"8709b04e-a9d6-4d38-a0e7-dcc4e226be53","Type":"ContainerDied","Data":"b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.387767 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" event={"ID":"8709b04e-a9d6-4d38-a0e7-dcc4e226be53","Type":"ContainerDied","Data":"ba8dd40995df63d26861e15f2328ad104f87e91000e15a6dbedbcc7dc5f4a62f"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.387838 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-llhqq" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.395210 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2wcvz"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.396567 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.401352 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.401366 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.401418 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.403533 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2wcvz"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.407466 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c8pjq" event={"ID":"42c87540-53a6-4923-adcb-3af20aa678d1","Type":"ContainerStarted","Data":"4e309c03d554eea0bb3db4cf9ff24a3b3fa6b44b749e0a2339c0c21f05783d2a"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.407489 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c8pjq" event={"ID":"42c87540-53a6-4923-adcb-3af20aa678d1","Type":"ContainerStarted","Data":"12f80b117ca5271aea4e2b489285a7b5431986cb7fe3deb774e1cade55fabf6f"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.422619 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t8qtr" event={"ID":"8d0913a9-d96e-404a-9ece-85dc07caad20","Type":"ContainerStarted","Data":"9d872f42509c6df89e49d65db6b6dc809cb71b73f7f05093b33b930bc565da60"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.422651 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t8qtr" event={"ID":"8d0913a9-d96e-404a-9ece-85dc07caad20","Type":"ContainerStarted","Data":"c03a054fc0d29ea4d76b9943d5680e5d96db27d3345681c0e439228c73f72009"} Dec 04 17:55:26 crc 
kubenswrapper[4948]: I1204 17:55:26.429263 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zxqkx" event={"ID":"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30","Type":"ContainerStarted","Data":"1c3f155cb50187b8d908383b7c6c967587f91315bc926aecfe8039e5e6ae880a"} Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.441569 4948 scope.go:117] "RemoveContainer" containerID="b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.444977 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.445011 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-567f8\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-kube-api-access-567f8\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.445055 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-cache\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.445099 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-lock\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.445194 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.445678 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-cache\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.446069 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-lock\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.446375 4948 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.446390 4948 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.446427 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift podName:6bc62dd5-67bd-4e26-bedb-58e1d56abac9 nodeName:}" failed. No retries permitted until 2025-12-04 17:55:26.946413289 +0000 UTC m=+1738.307487691 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift") pod "swift-storage-0" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9") : configmap "swift-ring-files" not found Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.446815 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.469283 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-567f8\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-kube-api-access-567f8\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.486783 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qlssc"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.503374 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-qlssc"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.505686 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-t8qtr" podStartSLOduration=4.505676562 podStartE2EDuration="4.505676562s" podCreationTimestamp="2025-12-04 17:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:26.464604137 +0000 UTC m=+1737.825678539" watchObservedRunningTime="2025-12-04 17:55:26.505676562 +0000 UTC m=+1737.866750964" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.516007 4948 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-llhqq"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.521917 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.534294 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-llhqq"] Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.543594 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-c8pjq" podStartSLOduration=3.543577075 podStartE2EDuration="3.543577075s" podCreationTimestamp="2025-12-04 17:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:26.497780611 +0000 UTC m=+1737.858855013" watchObservedRunningTime="2025-12-04 17:55:26.543577075 +0000 UTC m=+1737.904651477" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.556556 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8wd\" (UniqueName: \"kubernetes.io/projected/c74958a4-caed-4579-b0ff-cbabe46b09dd-kube-api-access-fd8wd\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.556586 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-ring-data-devices\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 
17:55:26.556654 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-dispersionconf\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.556689 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-scripts\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.556754 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-combined-ca-bundle\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.556831 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c74958a4-caed-4579-b0ff-cbabe46b09dd-etc-swift\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.556874 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-swiftconf\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 
17:55:26.563765 4948 scope.go:117] "RemoveContainer" containerID="46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.627147 4948 scope.go:117] "RemoveContainer" containerID="b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369" Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.628221 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369\": container with ID starting with b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369 not found: ID does not exist" containerID="b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.628253 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369"} err="failed to get container status \"b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369\": rpc error: code = NotFound desc = could not find container \"b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369\": container with ID starting with b508319574f5f6f78ec03ab5de2a07be2ad4d3e40d20aaf85459cfffcd337369 not found: ID does not exist" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.628275 4948 scope.go:117] "RemoveContainer" containerID="46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b" Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.628483 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b\": container with ID starting with 46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b not found: ID does not exist" 
containerID="46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.628521 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b"} err="failed to get container status \"46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b\": rpc error: code = NotFound desc = could not find container \"46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b\": container with ID starting with 46b802276da78cb8dbf97be3985eee2ead34b84b537f1f48346f4a39a5f5792b not found: ID does not exist" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.642386 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.658139 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-swiftconf\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.658238 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8wd\" (UniqueName: \"kubernetes.io/projected/c74958a4-caed-4579-b0ff-cbabe46b09dd-kube-api-access-fd8wd\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.658261 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-ring-data-devices\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " 
pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.658557 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-dispersionconf\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.659314 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-ring-data-devices\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.659371 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-scripts\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.659472 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-combined-ca-bundle\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.659527 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c74958a4-caed-4579-b0ff-cbabe46b09dd-etc-swift\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 
17:55:26.659786 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c74958a4-caed-4579-b0ff-cbabe46b09dd-etc-swift\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.660398 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-scripts\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.663995 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-dispersionconf\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.664487 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-combined-ca-bundle\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.664826 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-swiftconf\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.677560 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8wd\" (UniqueName: 
\"kubernetes.io/projected/c74958a4-caed-4579-b0ff-cbabe46b09dd-kube-api-access-fd8wd\") pod \"swift-ring-rebalance-2wcvz\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") " pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.760515 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-dns-svc\") pod \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.760629 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcdv5\" (UniqueName: \"kubernetes.io/projected/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-kube-api-access-qcdv5\") pod \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.760687 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-ovsdbserver-sb\") pod \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.760713 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-config\") pod \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\" (UID: \"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd\") " Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.766967 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-kube-api-access-qcdv5" (OuterVolumeSpecName: "kube-api-access-qcdv5") pod "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" (UID: "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd"). 
InnerVolumeSpecName "kube-api-access-qcdv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.782101 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" (UID: "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.782724 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-config" (OuterVolumeSpecName: "config") pod "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" (UID: "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.799572 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" (UID: "9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.828204 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2wcvz" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.864217 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.864671 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcdv5\" (UniqueName: \"kubernetes.io/projected/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-kube-api-access-qcdv5\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.864690 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.864702 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.946062 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8709b04e-a9d6-4d38-a0e7-dcc4e226be53" path="/var/lib/kubelet/pods/8709b04e-a9d6-4d38-a0e7-dcc4e226be53/volumes" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.948244 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef332ce3-f50d-49f9-a786-1d656f9bdf7d" path="/var/lib/kubelet/pods/ef332ce3-f50d-49f9-a786-1d656f9bdf7d/volumes" Dec 04 17:55:26 crc kubenswrapper[4948]: I1204 17:55:26.966011 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:26 crc 
kubenswrapper[4948]: E1204 17:55:26.966236 4948 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.966256 4948 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 17:55:26 crc kubenswrapper[4948]: E1204 17:55:26.966295 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift podName:6bc62dd5-67bd-4e26-bedb-58e1d56abac9 nodeName:}" failed. No retries permitted until 2025-12-04 17:55:27.96628201 +0000 UTC m=+1739.327356412 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift") pod "swift-storage-0" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9") : configmap "swift-ring-files" not found Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.300365 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2wcvz"] Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.446821 4948 generic.go:334] "Generic (PLEG): container finished" podID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerID="4459d752207e41a642608331152bc2124974354406bb5f2442e65c39b36eab1b" exitCode=0 Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.446891 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zxqkx" event={"ID":"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30","Type":"ContainerDied","Data":"4459d752207e41a642608331152bc2124974354406bb5f2442e65c39b36eab1b"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.449847 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.450016 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-jz2bh" event={"ID":"9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd","Type":"ContainerDied","Data":"ea76fa0757dcf2755fa052e215aca0731d3df4331bbf5b229935b7358c677947"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.450078 4948 scope.go:117] "RemoveContainer" containerID="845491e5d99ac34f52d4280ef7562a97c7b50c24d13195d5224b352388f2d395" Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.454633 4948 generic.go:334] "Generic (PLEG): container finished" podID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerID="faf818866523b856be156999c88d5833f4148f5dd491ab22a6cad19520684a39" exitCode=0 Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.454679 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" event={"ID":"6e2de27d-b6aa-42a4-a11e-c9241d8b619d","Type":"ContainerDied","Data":"faf818866523b856be156999c88d5833f4148f5dd491ab22a6cad19520684a39"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.454694 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" event={"ID":"6e2de27d-b6aa-42a4-a11e-c9241d8b619d","Type":"ContainerStarted","Data":"9944f73f50fef25ae809ad5782ad4fd887c7f3fbef550db75b631d6c7bccfbae"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.457187 4948 generic.go:334] "Generic (PLEG): container finished" podID="ee248375-d52b-46cc-bef6-c6a53f95537e" containerID="13f1ec3c600161183c7b13c25fb8ff3c4a268954e2d40bac4b5524f004c61111" exitCode=0 Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.457227 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c83f-account-create-update-kx2n6" 
event={"ID":"ee248375-d52b-46cc-bef6-c6a53f95537e","Type":"ContainerDied","Data":"13f1ec3c600161183c7b13c25fb8ff3c4a268954e2d40bac4b5524f004c61111"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.459388 4948 generic.go:334] "Generic (PLEG): container finished" podID="42c87540-53a6-4923-adcb-3af20aa678d1" containerID="4e309c03d554eea0bb3db4cf9ff24a3b3fa6b44b749e0a2339c0c21f05783d2a" exitCode=0 Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.459446 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c8pjq" event={"ID":"42c87540-53a6-4923-adcb-3af20aa678d1","Type":"ContainerDied","Data":"4e309c03d554eea0bb3db4cf9ff24a3b3fa6b44b749e0a2339c0c21f05783d2a"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.462643 4948 generic.go:334] "Generic (PLEG): container finished" podID="8d0913a9-d96e-404a-9ece-85dc07caad20" containerID="9d872f42509c6df89e49d65db6b6dc809cb71b73f7f05093b33b930bc565da60" exitCode=0 Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.462694 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t8qtr" event={"ID":"8d0913a9-d96e-404a-9ece-85dc07caad20","Type":"ContainerDied","Data":"9d872f42509c6df89e49d65db6b6dc809cb71b73f7f05093b33b930bc565da60"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.464906 4948 generic.go:334] "Generic (PLEG): container finished" podID="b6b875eb-4f81-407e-b0f1-12086316a557" containerID="afadd5bc8b50ff866da0f039ab345ad38c988c0f86ccd90c03589dbd3fca1a90" exitCode=0 Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.465535 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7046-account-create-update-9zvv4" event={"ID":"b6b875eb-4f81-407e-b0f1-12086316a557","Type":"ContainerDied","Data":"afadd5bc8b50ff866da0f039ab345ad38c988c0f86ccd90c03589dbd3fca1a90"} Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.550529 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6bc7876d45-jz2bh"] Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.563209 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-jz2bh"] Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.913811 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:55:27 crc kubenswrapper[4948]: E1204 17:55:27.914948 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:55:27 crc kubenswrapper[4948]: I1204 17:55:27.991107 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:27 crc kubenswrapper[4948]: E1204 17:55:27.991408 4948 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 17:55:27 crc kubenswrapper[4948]: E1204 17:55:27.991430 4948 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 17:55:27 crc kubenswrapper[4948]: E1204 17:55:27.991465 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift podName:6bc62dd5-67bd-4e26-bedb-58e1d56abac9 nodeName:}" failed. No retries permitted until 2025-12-04 17:55:29.991452558 +0000 UTC m=+1741.352526960 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift") pod "swift-storage-0" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9") : configmap "swift-ring-files" not found Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.321887 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ml66n"] Dec 04 17:55:28 crc kubenswrapper[4948]: E1204 17:55:28.322534 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" containerName="init" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.322561 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" containerName="init" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.322897 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" containerName="init" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.323843 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.348462 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ml66n"] Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.397600 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-operator-scripts\") pod \"glance-db-create-ml66n\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") " pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.397658 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc67m\" (UniqueName: \"kubernetes.io/projected/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-kube-api-access-fc67m\") pod \"glance-db-create-ml66n\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") " pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.433711 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-47ce-account-create-update-prrl6"] Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.434959 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.437152 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.451898 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-47ce-account-create-update-prrl6"] Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.499463 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc67m\" (UniqueName: \"kubernetes.io/projected/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-kube-api-access-fc67m\") pod \"glance-db-create-ml66n\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") " pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.500023 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-operator-scripts\") pod \"glance-db-create-ml66n\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") " pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.500804 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-operator-scripts\") pod \"glance-db-create-ml66n\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") " pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.520508 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc67m\" (UniqueName: \"kubernetes.io/projected/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-kube-api-access-fc67m\") pod \"glance-db-create-ml66n\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") " pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 
17:55:28.602054 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a24421a3-5139-4a65-b91e-8915d1b96103-operator-scripts\") pod \"glance-47ce-account-create-update-prrl6\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") " pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.602099 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h25b\" (UniqueName: \"kubernetes.io/projected/a24421a3-5139-4a65-b91e-8915d1b96103-kube-api-access-2h25b\") pod \"glance-47ce-account-create-update-prrl6\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") " pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.645110 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ml66n" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.703640 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a24421a3-5139-4a65-b91e-8915d1b96103-operator-scripts\") pod \"glance-47ce-account-create-update-prrl6\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") " pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.703680 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h25b\" (UniqueName: \"kubernetes.io/projected/a24421a3-5139-4a65-b91e-8915d1b96103-kube-api-access-2h25b\") pod \"glance-47ce-account-create-update-prrl6\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") " pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.704750 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a24421a3-5139-4a65-b91e-8915d1b96103-operator-scripts\") pod \"glance-47ce-account-create-update-prrl6\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") " pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.733140 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h25b\" (UniqueName: \"kubernetes.io/projected/a24421a3-5139-4a65-b91e-8915d1b96103-kube-api-access-2h25b\") pod \"glance-47ce-account-create-update-prrl6\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") " pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.754566 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-47ce-account-create-update-prrl6" Dec 04 17:55:28 crc kubenswrapper[4948]: W1204 17:55:28.868515 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc74958a4_caed_4579_b0ff_cbabe46b09dd.slice/crio-5fba489d3bdcd576861b11e62ba7976f504dbee1494a255b54e7242589ecd2b2 WatchSource:0}: Error finding container 5fba489d3bdcd576861b11e62ba7976f504dbee1494a255b54e7242589ecd2b2: Status 404 returned error can't find the container with id 5fba489d3bdcd576861b11e62ba7976f504dbee1494a255b54e7242589ecd2b2 Dec 04 17:55:28 crc kubenswrapper[4948]: I1204 17:55:28.929134 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd" path="/var/lib/kubelet/pods/9aafa5e3-5239-4ac9-a3d8-a2c11ec411fd/volumes" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.016643 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.024084 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.027999 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t8qtr" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.078512 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7046-account-create-update-9zvv4" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.111454 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0913a9-d96e-404a-9ece-85dc07caad20-operator-scripts\") pod \"8d0913a9-d96e-404a-9ece-85dc07caad20\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.111842 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fq9d\" (UniqueName: \"kubernetes.io/projected/ee248375-d52b-46cc-bef6-c6a53f95537e-kube-api-access-4fq9d\") pod \"ee248375-d52b-46cc-bef6-c6a53f95537e\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.111870 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq5tz\" (UniqueName: \"kubernetes.io/projected/42c87540-53a6-4923-adcb-3af20aa678d1-kube-api-access-kq5tz\") pod \"42c87540-53a6-4923-adcb-3af20aa678d1\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.111902 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgcb\" (UniqueName: \"kubernetes.io/projected/8d0913a9-d96e-404a-9ece-85dc07caad20-kube-api-access-jqgcb\") pod \"8d0913a9-d96e-404a-9ece-85dc07caad20\" (UID: \"8d0913a9-d96e-404a-9ece-85dc07caad20\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.111925 
4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee248375-d52b-46cc-bef6-c6a53f95537e-operator-scripts\") pod \"ee248375-d52b-46cc-bef6-c6a53f95537e\" (UID: \"ee248375-d52b-46cc-bef6-c6a53f95537e\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.112030 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42c87540-53a6-4923-adcb-3af20aa678d1-operator-scripts\") pod \"42c87540-53a6-4923-adcb-3af20aa678d1\" (UID: \"42c87540-53a6-4923-adcb-3af20aa678d1\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.112810 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0913a9-d96e-404a-9ece-85dc07caad20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d0913a9-d96e-404a-9ece-85dc07caad20" (UID: "8d0913a9-d96e-404a-9ece-85dc07caad20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.112884 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee248375-d52b-46cc-bef6-c6a53f95537e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee248375-d52b-46cc-bef6-c6a53f95537e" (UID: "ee248375-d52b-46cc-bef6-c6a53f95537e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.112951 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c87540-53a6-4923-adcb-3af20aa678d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42c87540-53a6-4923-adcb-3af20aa678d1" (UID: "42c87540-53a6-4923-adcb-3af20aa678d1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.116054 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0913a9-d96e-404a-9ece-85dc07caad20-kube-api-access-jqgcb" (OuterVolumeSpecName: "kube-api-access-jqgcb") pod "8d0913a9-d96e-404a-9ece-85dc07caad20" (UID: "8d0913a9-d96e-404a-9ece-85dc07caad20"). InnerVolumeSpecName "kube-api-access-jqgcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.116713 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee248375-d52b-46cc-bef6-c6a53f95537e-kube-api-access-4fq9d" (OuterVolumeSpecName: "kube-api-access-4fq9d") pod "ee248375-d52b-46cc-bef6-c6a53f95537e" (UID: "ee248375-d52b-46cc-bef6-c6a53f95537e"). InnerVolumeSpecName "kube-api-access-4fq9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.120174 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c87540-53a6-4923-adcb-3af20aa678d1-kube-api-access-kq5tz" (OuterVolumeSpecName: "kube-api-access-kq5tz") pod "42c87540-53a6-4923-adcb-3af20aa678d1" (UID: "42c87540-53a6-4923-adcb-3af20aa678d1"). InnerVolumeSpecName "kube-api-access-kq5tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.216801 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6vg\" (UniqueName: \"kubernetes.io/projected/b6b875eb-4f81-407e-b0f1-12086316a557-kube-api-access-rm6vg\") pod \"b6b875eb-4f81-407e-b0f1-12086316a557\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.216859 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b875eb-4f81-407e-b0f1-12086316a557-operator-scripts\") pod \"b6b875eb-4f81-407e-b0f1-12086316a557\" (UID: \"b6b875eb-4f81-407e-b0f1-12086316a557\") " Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.217365 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42c87540-53a6-4923-adcb-3af20aa678d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.217379 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0913a9-d96e-404a-9ece-85dc07caad20-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.217389 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fq9d\" (UniqueName: \"kubernetes.io/projected/ee248375-d52b-46cc-bef6-c6a53f95537e-kube-api-access-4fq9d\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.217403 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq5tz\" (UniqueName: \"kubernetes.io/projected/42c87540-53a6-4923-adcb-3af20aa678d1-kube-api-access-kq5tz\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.217411 4948 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-jqgcb\" (UniqueName: \"kubernetes.io/projected/8d0913a9-d96e-404a-9ece-85dc07caad20-kube-api-access-jqgcb\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.217419 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee248375-d52b-46cc-bef6-c6a53f95537e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.222199 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b875eb-4f81-407e-b0f1-12086316a557-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6b875eb-4f81-407e-b0f1-12086316a557" (UID: "b6b875eb-4f81-407e-b0f1-12086316a557"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.235453 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b875eb-4f81-407e-b0f1-12086316a557-kube-api-access-rm6vg" (OuterVolumeSpecName: "kube-api-access-rm6vg") pod "b6b875eb-4f81-407e-b0f1-12086316a557" (UID: "b6b875eb-4f81-407e-b0f1-12086316a557"). InnerVolumeSpecName "kube-api-access-rm6vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.320180 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6vg\" (UniqueName: \"kubernetes.io/projected/b6b875eb-4f81-407e-b0f1-12086316a557-kube-api-access-rm6vg\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.320585 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b875eb-4f81-407e-b0f1-12086316a557-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.481408 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2wcvz" event={"ID":"c74958a4-caed-4579-b0ff-cbabe46b09dd","Type":"ContainerStarted","Data":"5fba489d3bdcd576861b11e62ba7976f504dbee1494a255b54e7242589ecd2b2"} Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.486615 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c83f-account-create-update-kx2n6" event={"ID":"ee248375-d52b-46cc-bef6-c6a53f95537e","Type":"ContainerDied","Data":"1de17f5fd9af2c10b9ac44e40dde03e1bb48c1c766ef94fe7947c046e8536a7d"} Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.486663 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1de17f5fd9af2c10b9ac44e40dde03e1bb48c1c766ef94fe7947c046e8536a7d" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.486674 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c83f-account-create-update-kx2n6" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.492017 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-c8pjq" event={"ID":"42c87540-53a6-4923-adcb-3af20aa678d1","Type":"ContainerDied","Data":"12f80b117ca5271aea4e2b489285a7b5431986cb7fe3deb774e1cade55fabf6f"} Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.492078 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12f80b117ca5271aea4e2b489285a7b5431986cb7fe3deb774e1cade55fabf6f" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.492157 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-c8pjq" Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.495388 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t8qtr" event={"ID":"8d0913a9-d96e-404a-9ece-85dc07caad20","Type":"ContainerDied","Data":"c03a054fc0d29ea4d76b9943d5680e5d96db27d3345681c0e439228c73f72009"} Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.495406 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-t8qtr"
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.495421 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03a054fc0d29ea4d76b9943d5680e5d96db27d3345681c0e439228c73f72009"
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.497128 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7046-account-create-update-9zvv4" event={"ID":"b6b875eb-4f81-407e-b0f1-12086316a557","Type":"ContainerDied","Data":"81b07549d5ca0f26e7a47aedbcccbaff78433783a44b9ba404c9b4fa2a25ae7b"}
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.497148 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b07549d5ca0f26e7a47aedbcccbaff78433783a44b9ba404c9b4fa2a25ae7b"
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.497213 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7046-account-create-update-9zvv4"
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.596384 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ml66n"]
Dec 04 17:55:29 crc kubenswrapper[4948]: W1204 17:55:29.596778 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66b5b4ee_a4ce_416c_8807_e5fe61c9c59d.slice/crio-00f03dbcfdfd4aad91ab2a471f65d0649c5c8aa3ac38f2004d1b6d193f931a05 WatchSource:0}: Error finding container 00f03dbcfdfd4aad91ab2a471f65d0649c5c8aa3ac38f2004d1b6d193f931a05: Status 404 returned error can't find the container with id 00f03dbcfdfd4aad91ab2a471f65d0649c5c8aa3ac38f2004d1b6d193f931a05
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.663209 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-47ce-account-create-update-prrl6"]
Dec 04 17:55:29 crc kubenswrapper[4948]: I1204 17:55:29.671820 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.032268 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0"
Dec 04 17:55:30 crc kubenswrapper[4948]: E1204 17:55:30.033300 4948 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 04 17:55:30 crc kubenswrapper[4948]: E1204 17:55:30.033319 4948 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 04 17:55:30 crc kubenswrapper[4948]: E1204 17:55:30.033353 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift podName:6bc62dd5-67bd-4e26-bedb-58e1d56abac9 nodeName:}" failed. No retries permitted until 2025-12-04 17:55:34.033340785 +0000 UTC m=+1745.394415187 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift") pod "swift-storage-0" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9") : configmap "swift-ring-files" not found
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.505572 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" event={"ID":"6e2de27d-b6aa-42a4-a11e-c9241d8b619d","Type":"ContainerStarted","Data":"92e0bc1e9eaccb74286935aa62862efcfa21a6d8d06ec096b9c55ab939a73593"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.506604 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc"
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.509664 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-47ce-account-create-update-prrl6" event={"ID":"a24421a3-5139-4a65-b91e-8915d1b96103","Type":"ContainerStarted","Data":"b6183c62bde6cb6fca075549c3d9d9e84661daae0b28f33907be13b8f3bc5e84"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.509708 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-47ce-account-create-update-prrl6" event={"ID":"a24421a3-5139-4a65-b91e-8915d1b96103","Type":"ContainerStarted","Data":"001ea9963964cdc4d73bb1b451c07a993783909d953881b24f65d22fa0097d2b"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.514095 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zxqkx" event={"ID":"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30","Type":"ContainerStarted","Data":"1f177aba4e6957989515bf58315e7f84082fdc10030ba9abe9bf516f7d61fc51"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.514245 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-zxqkx"
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.515829 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b6b365e8-6c2a-41fe-b50a-1702144d67d4","Type":"ContainerStarted","Data":"60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.517291 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ml66n" event={"ID":"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d","Type":"ContainerStarted","Data":"beb87d7c4d42b358a7b2c380c851f944dc0d5a8efb5eaf6f4fed99b0a0bf02b0"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.517317 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ml66n" event={"ID":"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d","Type":"ContainerStarted","Data":"00f03dbcfdfd4aad91ab2a471f65d0649c5c8aa3ac38f2004d1b6d193f931a05"}
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.570962 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-zxqkx" podStartSLOduration=7.570943048 podStartE2EDuration="7.570943048s" podCreationTimestamp="2025-12-04 17:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:30.550386256 +0000 UTC m=+1741.911460658" watchObservedRunningTime="2025-12-04 17:55:30.570943048 +0000 UTC m=+1741.932017460"
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.571585 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" podStartSLOduration=5.571578669 podStartE2EDuration="5.571578669s" podCreationTimestamp="2025-12-04 17:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:30.534670476 +0000 UTC m=+1741.895744878" watchObservedRunningTime="2025-12-04 17:55:30.571578669 +0000 UTC m=+1741.932653081"
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.577646 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-47ce-account-create-update-prrl6" podStartSLOduration=2.577630941 podStartE2EDuration="2.577630941s" podCreationTimestamp="2025-12-04 17:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:30.567280772 +0000 UTC m=+1741.928355174" watchObservedRunningTime="2025-12-04 17:55:30.577630941 +0000 UTC m=+1741.938705353"
Dec 04 17:55:30 crc kubenswrapper[4948]: I1204 17:55:30.596002 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-ml66n" podStartSLOduration=2.595980854 podStartE2EDuration="2.595980854s" podCreationTimestamp="2025-12-04 17:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:55:30.580667187 +0000 UTC m=+1741.941741589" watchObservedRunningTime="2025-12-04 17:55:30.595980854 +0000 UTC m=+1741.957055246"
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.531841 4948 generic.go:334] "Generic (PLEG): container finished" podID="66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" containerID="beb87d7c4d42b358a7b2c380c851f944dc0d5a8efb5eaf6f4fed99b0a0bf02b0" exitCode=0
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.532268 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ml66n" event={"ID":"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d","Type":"ContainerDied","Data":"beb87d7c4d42b358a7b2c380c851f944dc0d5a8efb5eaf6f4fed99b0a0bf02b0"}
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.537641 4948 generic.go:334] "Generic (PLEG): container finished" podID="a24421a3-5139-4a65-b91e-8915d1b96103" containerID="b6183c62bde6cb6fca075549c3d9d9e84661daae0b28f33907be13b8f3bc5e84" exitCode=0
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.537726 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-47ce-account-create-update-prrl6" event={"ID":"a24421a3-5139-4a65-b91e-8915d1b96103","Type":"ContainerDied","Data":"b6183c62bde6cb6fca075549c3d9d9e84661daae0b28f33907be13b8f3bc5e84"}
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.542298 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b6b365e8-6c2a-41fe-b50a-1702144d67d4","Type":"ContainerStarted","Data":"27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205"}
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.542337 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Dec 04 17:55:31 crc kubenswrapper[4948]: I1204 17:55:31.606720 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=5.359033042 podStartE2EDuration="8.606696772s" podCreationTimestamp="2025-12-04 17:55:23 +0000 UTC" firstStartedPulling="2025-12-04 17:55:25.893009915 +0000 UTC m=+1737.254084317" lastFinishedPulling="2025-12-04 17:55:29.140673645 +0000 UTC m=+1740.501748047" observedRunningTime="2025-12-04 17:55:31.600520766 +0000 UTC m=+1742.961595168" watchObservedRunningTime="2025-12-04 17:55:31.606696772 +0000 UTC m=+1742.967771184"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:34.118004 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:34.118204 4948 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:34.118746 4948 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:34.118806 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift podName:6bc62dd5-67bd-4e26-bedb-58e1d56abac9 nodeName:}" failed. No retries permitted until 2025-12-04 17:55:42.118788262 +0000 UTC m=+1753.479862664 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift") pod "swift-storage-0" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9") : configmap "swift-ring-files" not found
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:35.090260 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-zxqkx"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:35.623385 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:35.699449 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zxqkx"]
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:35.699921 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-zxqkx" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerName="dnsmasq-dns" containerID="cri-o://1f177aba4e6957989515bf58315e7f84082fdc10030ba9abe9bf516f7d61fc51" gracePeriod=10
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:35.850965 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5536cf0a_e6a0_4932_88ae_6fb9f6dfbb30.slice/crio-1f177aba4e6957989515bf58315e7f84082fdc10030ba9abe9bf516f7d61fc51.scope\": RecentStats: unable to find data in memory cache]"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.530390 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ml66n"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.540156 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-47ce-account-create-update-prrl6"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.598208 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc67m\" (UniqueName: \"kubernetes.io/projected/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-kube-api-access-fc67m\") pod \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") "
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.598507 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-operator-scripts\") pod \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\" (UID: \"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d\") "
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.598530 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a24421a3-5139-4a65-b91e-8915d1b96103-operator-scripts\") pod \"a24421a3-5139-4a65-b91e-8915d1b96103\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") "
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.598567 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h25b\" (UniqueName: \"kubernetes.io/projected/a24421a3-5139-4a65-b91e-8915d1b96103-kube-api-access-2h25b\") pod \"a24421a3-5139-4a65-b91e-8915d1b96103\" (UID: \"a24421a3-5139-4a65-b91e-8915d1b96103\") "
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.599453 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" (UID: "66b5b4ee-a4ce-416c-8807-e5fe61c9c59d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.599933 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a24421a3-5139-4a65-b91e-8915d1b96103-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a24421a3-5139-4a65-b91e-8915d1b96103" (UID: "a24421a3-5139-4a65-b91e-8915d1b96103"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.603132 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-kube-api-access-fc67m" (OuterVolumeSpecName: "kube-api-access-fc67m") pod "66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" (UID: "66b5b4ee-a4ce-416c-8807-e5fe61c9c59d"). InnerVolumeSpecName "kube-api-access-fc67m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.603177 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24421a3-5139-4a65-b91e-8915d1b96103-kube-api-access-2h25b" (OuterVolumeSpecName: "kube-api-access-2h25b") pod "a24421a3-5139-4a65-b91e-8915d1b96103" (UID: "a24421a3-5139-4a65-b91e-8915d1b96103"). InnerVolumeSpecName "kube-api-access-2h25b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.617767 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ml66n"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.617798 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ml66n" event={"ID":"66b5b4ee-a4ce-416c-8807-e5fe61c9c59d","Type":"ContainerDied","Data":"00f03dbcfdfd4aad91ab2a471f65d0649c5c8aa3ac38f2004d1b6d193f931a05"}
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.617841 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00f03dbcfdfd4aad91ab2a471f65d0649c5c8aa3ac38f2004d1b6d193f931a05"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.619701 4948 generic.go:334] "Generic (PLEG): container finished" podID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerID="1f177aba4e6957989515bf58315e7f84082fdc10030ba9abe9bf516f7d61fc51" exitCode=0
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.619752 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zxqkx" event={"ID":"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30","Type":"ContainerDied","Data":"1f177aba4e6957989515bf58315e7f84082fdc10030ba9abe9bf516f7d61fc51"}
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.621651 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-47ce-account-create-update-prrl6" event={"ID":"a24421a3-5139-4a65-b91e-8915d1b96103","Type":"ContainerDied","Data":"001ea9963964cdc4d73bb1b451c07a993783909d953881b24f65d22fa0097d2b"}
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.621701 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001ea9963964cdc4d73bb1b451c07a993783909d953881b24f65d22fa0097d2b"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.621757 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-47ce-account-create-update-prrl6"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.701342 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.701422 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a24421a3-5139-4a65-b91e-8915d1b96103-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.701442 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h25b\" (UniqueName: \"kubernetes.io/projected/a24421a3-5139-4a65-b91e-8915d1b96103-kube-api-access-2h25b\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:38.701461 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc67m\" (UniqueName: \"kubernetes.io/projected/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d-kube-api-access-fc67m\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:40.088063 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-zxqkx" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:41.914582 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:41.915153 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:42.162494 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:42.162744 4948 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:42.162779 4948 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:42.162850 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift podName:6bc62dd5-67bd-4e26-bedb-58e1d56abac9 nodeName:}" failed. No retries permitted until 2025-12-04 17:55:58.162826421 +0000 UTC m=+1769.523900833 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift") pod "swift-storage-0" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9") : configmap "swift-ring-files" not found
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598278 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jn67c"]
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:43.598623 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b875eb-4f81-407e-b0f1-12086316a557" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598638 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b875eb-4f81-407e-b0f1-12086316a557" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:43.598656 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0913a9-d96e-404a-9ece-85dc07caad20" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598664 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0913a9-d96e-404a-9ece-85dc07caad20" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:43.598681 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c87540-53a6-4923-adcb-3af20aa678d1" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598689 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c87540-53a6-4923-adcb-3af20aa678d1" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:43.598889 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee248375-d52b-46cc-bef6-c6a53f95537e" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598898 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee248375-d52b-46cc-bef6-c6a53f95537e" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:43.598918 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24421a3-5139-4a65-b91e-8915d1b96103" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598925 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24421a3-5139-4a65-b91e-8915d1b96103" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: E1204 17:55:43.598936 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.598944 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599153 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b875eb-4f81-407e-b0f1-12086316a557" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599167 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0913a9-d96e-404a-9ece-85dc07caad20" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599184 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee248375-d52b-46cc-bef6-c6a53f95537e" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599195 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" containerName="mariadb-database-create"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599210 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c87540-53a6-4923-adcb-3af20aa678d1" containerName="mariadb-database-create"
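Editor's note: the three "MountVolume.SetUp failed" entries for the etc-swift volume above show the kubelet's per-operation retry backoff doubling (durationBeforeRetry 4s, then 8s, then 16s) while the "swift-ring-files" configmap is missing. A minimal sketch of such a capped doubling policy follows; it is an illustration only, not the kubelet's actual code, and the constants (initial delay, cap) are assumptions:

```python
# Hypothetical sketch of a capped exponential retry backoff like the
# one visible above (4s -> 8s -> 16s). Constants are assumptions.

def backoff_durations(initial=2.0, factor=2.0, cap=32.0, attempts=5):
    """Yield exponentially growing retry delays in seconds, capped at `cap`."""
    delay = initial
    for _ in range(attempts):
        delay = min(delay * factor, cap)  # double, but never exceed the cap
        yield delay

# delays after successive failures
print(list(backoff_durations()))  # -> [4.0, 8.0, 16.0, 32.0, 32.0]
```

Once the missing configmap appears, the next scheduled retry succeeds and the backoff resets.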
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599222 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24421a3-5139-4a65-b91e-8915d1b96103" containerName="mariadb-account-create-update"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.599831 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.603611 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.603764 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bdk4p"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.608686 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jn67c"]
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.696351 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-config-data\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.696435 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl8fs\" (UniqueName: \"kubernetes.io/projected/55887774-d332-4083-8f3c-6281330114cd-kube-api-access-rl8fs\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.696467 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-db-sync-config-data\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.696486 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-combined-ca-bundle\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.798694 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-config-data\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.798786 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl8fs\" (UniqueName: \"kubernetes.io/projected/55887774-d332-4083-8f3c-6281330114cd-kube-api-access-rl8fs\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.798838 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-db-sync-config-data\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.798863 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-combined-ca-bundle\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.805788 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-db-sync-config-data\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.805930 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-combined-ca-bundle\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.814082 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-config-data\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.846950 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl8fs\" (UniqueName: \"kubernetes.io/projected/55887774-d332-4083-8f3c-6281330114cd-kube-api-access-rl8fs\") pod \"glance-db-sync-jn67c\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.857297 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 04 17:55:44 crc kubenswrapper[4948]: I1204 17:55:43.918554 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jn67c"
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.083183 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zxqkx"
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.126577 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-dns-svc\") pod \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") "
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.126741 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-config\") pod \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") "
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.126912 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-sb\") pod \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") "
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.126974 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv6sj\" (UniqueName: \"kubernetes.io/projected/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-kube-api-access-qv6sj\") pod \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") "
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.127001 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-nb\") pod \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\" (UID: \"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30\") "
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.136868 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-kube-api-access-qv6sj" (OuterVolumeSpecName: "kube-api-access-qv6sj") pod "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" (UID: "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30"). InnerVolumeSpecName "kube-api-access-qv6sj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.164964 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-config" (OuterVolumeSpecName: "config") pod "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" (UID: "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.172421 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" (UID: "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.172994 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" (UID: "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.183391 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" (UID: "5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.229944 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.229994 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv6sj\" (UniqueName: \"kubernetes.io/projected/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-kube-api-access-qv6sj\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.230005 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.230015 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.230026 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30-config\") on node \"crc\" DevicePath \"\""
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.450560 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jn67c"]
Dec 04 17:55:45 crc kubenswrapper[4948]: W1204 17:55:45.460736 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55887774_d332_4083_8f3c_6281330114cd.slice/crio-6eac8ea76d56c376c91b9c35ad20489e0fafdbd54a3cf57328312e571a643ba0 WatchSource:0}: Error finding container 6eac8ea76d56c376c91b9c35ad20489e0fafdbd54a3cf57328312e571a643ba0: Status 404 returned error can't find the container with id 6eac8ea76d56c376c91b9c35ad20489e0fafdbd54a3cf57328312e571a643ba0
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.676546 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jn67c" event={"ID":"55887774-d332-4083-8f3c-6281330114cd","Type":"ContainerStarted","Data":"6eac8ea76d56c376c91b9c35ad20489e0fafdbd54a3cf57328312e571a643ba0"}
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.678958 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-zxqkx" event={"ID":"5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30","Type":"ContainerDied","Data":"1c3f155cb50187b8d908383b7c6c967587f91315bc926aecfe8039e5e6ae880a"}
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.679025 4948 scope.go:117] "RemoveContainer" containerID="1f177aba4e6957989515bf58315e7f84082fdc10030ba9abe9bf516f7d61fc51"
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.679203 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-zxqkx"
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.681142 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2wcvz" event={"ID":"c74958a4-caed-4579-b0ff-cbabe46b09dd","Type":"ContainerStarted","Data":"41c55b55d495ef9b147a733c4d666ff5ede3c80eb031a735bf7deb9b73dcdf08"}
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.703202 4948 scope.go:117] "RemoveContainer" containerID="4459d752207e41a642608331152bc2124974354406bb5f2442e65c39b36eab1b"
Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.714642 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2wcvz" podStartSLOduration=3.733486445 podStartE2EDuration="19.714624747s" podCreationTimestamp="2025-12-04 17:55:26 +0000 UTC" firstStartedPulling="2025-12-04 17:55:28.881496104 +0000 UTC m=+1740.242570506" lastFinishedPulling="2025-12-04 17:55:44.862634406 +0000
UTC m=+1756.223708808" observedRunningTime="2025-12-04 17:55:45.710520346 +0000 UTC m=+1757.071594758" watchObservedRunningTime="2025-12-04 17:55:45.714624747 +0000 UTC m=+1757.075699149" Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.735795 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zxqkx"] Dec 04 17:55:45 crc kubenswrapper[4948]: I1204 17:55:45.746711 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-zxqkx"] Dec 04 17:55:46 crc kubenswrapper[4948]: I1204 17:55:46.948424 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" path="/var/lib/kubelet/pods/5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30/volumes" Dec 04 17:55:47 crc kubenswrapper[4948]: I1204 17:55:47.699162 4948 generic.go:334] "Generic (PLEG): container finished" podID="90b4baf7-8366-4f47-8515-c33e1b691856" containerID="82c901cf00202ab9ecd08dc4c09ede1d9fcdc9869bb58784238db83a9b10208f" exitCode=0 Dec 04 17:55:47 crc kubenswrapper[4948]: I1204 17:55:47.699255 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"90b4baf7-8366-4f47-8515-c33e1b691856","Type":"ContainerDied","Data":"82c901cf00202ab9ecd08dc4c09ede1d9fcdc9869bb58784238db83a9b10208f"} Dec 04 17:55:47 crc kubenswrapper[4948]: I1204 17:55:47.701846 4948 generic.go:334] "Generic (PLEG): container finished" podID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerID="0faca2eff6b0bcf6f0f9c1e986baf52aab23458cefa4976735633696f679414d" exitCode=0 Dec 04 17:55:47 crc kubenswrapper[4948]: I1204 17:55:47.701886 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b34ca165-31d6-44fa-b175-ed2b1bf9f766","Type":"ContainerDied","Data":"0faca2eff6b0bcf6f0f9c1e986baf52aab23458cefa4976735633696f679414d"} Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.714666 4948 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-server-0" event={"ID":"90b4baf7-8366-4f47-8515-c33e1b691856","Type":"ContainerStarted","Data":"ce3cf731c06ee83c40bae89c0c8e62893dd7be16f5ea71cde48d876fb17f3f41"} Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.716018 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.720557 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b34ca165-31d6-44fa-b175-ed2b1bf9f766","Type":"ContainerStarted","Data":"de019385e7338481198dc33686e0126bb41672f2effc6fd4c866ef06770f14f7"} Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.722126 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.751432 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.982753163 podStartE2EDuration="1m11.751411901s" podCreationTimestamp="2025-12-04 17:54:37 +0000 UTC" firstStartedPulling="2025-12-04 17:54:40.266174061 +0000 UTC m=+1691.627248463" lastFinishedPulling="2025-12-04 17:55:14.034832799 +0000 UTC m=+1725.395907201" observedRunningTime="2025-12-04 17:55:48.746442645 +0000 UTC m=+1760.107517057" watchObservedRunningTime="2025-12-04 17:55:48.751411901 +0000 UTC m=+1760.112486303" Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.774744 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.268156094 podStartE2EDuration="1m10.774731158s" podCreationTimestamp="2025-12-04 17:54:38 +0000 UTC" firstStartedPulling="2025-12-04 17:54:40.543438347 +0000 UTC m=+1691.904512749" lastFinishedPulling="2025-12-04 17:55:14.050013391 +0000 UTC m=+1725.411087813" observedRunningTime="2025-12-04 17:55:48.76936474 +0000 UTC m=+1760.130439152" 
watchObservedRunningTime="2025-12-04 17:55:48.774731158 +0000 UTC m=+1760.135805550" Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.785097 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-bd2ch" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerName="ovn-controller" probeResult="failure" output=< Dec 04 17:55:48 crc kubenswrapper[4948]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 17:55:48 crc kubenswrapper[4948]: > Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.825316 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:55:48 crc kubenswrapper[4948]: I1204 17:55:48.840279 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.075442 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-bd2ch-config-j87xt"] Dec 04 17:55:49 crc kubenswrapper[4948]: E1204 17:55:49.075852 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerName="dnsmasq-dns" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.075874 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerName="dnsmasq-dns" Dec 04 17:55:49 crc kubenswrapper[4948]: E1204 17:55:49.075922 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerName="init" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.075931 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" containerName="init" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.076178 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="5536cf0a-e6a0-4932-88ae-6fb9f6dfbb30" 
containerName="dnsmasq-dns" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.076836 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.093381 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bd2ch-config-j87xt"] Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.096453 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.099589 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-additional-scripts\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.099653 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run-ovn\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.099716 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-log-ovn\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.103215 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-scripts\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.103277 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2ll\" (UniqueName: \"kubernetes.io/projected/407599ac-f3f3-431a-943b-85c17c754c46-kube-api-access-xz2ll\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.103307 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.204778 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-additional-scripts\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206095 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-additional-scripts\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206220 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run-ovn\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206317 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-log-ovn\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206549 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-log-ovn\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206566 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run-ovn\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206603 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-scripts\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206689 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2ll\" (UniqueName: 
\"kubernetes.io/projected/407599ac-f3f3-431a-943b-85c17c754c46-kube-api-access-xz2ll\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206728 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.206826 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.208564 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-scripts\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.253276 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2ll\" (UniqueName: \"kubernetes.io/projected/407599ac-f3f3-431a-943b-85c17c754c46-kube-api-access-xz2ll\") pod \"ovn-controller-bd2ch-config-j87xt\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.407489 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:49 crc kubenswrapper[4948]: I1204 17:55:49.903967 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-bd2ch-config-j87xt"] Dec 04 17:55:49 crc kubenswrapper[4948]: W1204 17:55:49.909178 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod407599ac_f3f3_431a_943b_85c17c754c46.slice/crio-2e63773cc30c7c8d59865b71f33b826cfe8e25a9e4b2d425dfa080aca819f174 WatchSource:0}: Error finding container 2e63773cc30c7c8d59865b71f33b826cfe8e25a9e4b2d425dfa080aca819f174: Status 404 returned error can't find the container with id 2e63773cc30c7c8d59865b71f33b826cfe8e25a9e4b2d425dfa080aca819f174 Dec 04 17:55:50 crc kubenswrapper[4948]: I1204 17:55:50.758909 4948 generic.go:334] "Generic (PLEG): container finished" podID="407599ac-f3f3-431a-943b-85c17c754c46" containerID="579dd03102cd4a05a136c774a013c1043b2a2531695e724eadbbbb097e630ecf" exitCode=0 Dec 04 17:55:50 crc kubenswrapper[4948]: I1204 17:55:50.759081 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch-config-j87xt" event={"ID":"407599ac-f3f3-431a-943b-85c17c754c46","Type":"ContainerDied","Data":"579dd03102cd4a05a136c774a013c1043b2a2531695e724eadbbbb097e630ecf"} Dec 04 17:55:50 crc kubenswrapper[4948]: I1204 17:55:50.759202 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch-config-j87xt" event={"ID":"407599ac-f3f3-431a-943b-85c17c754c46","Type":"ContainerStarted","Data":"2e63773cc30c7c8d59865b71f33b826cfe8e25a9e4b2d425dfa080aca819f174"} Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.149910 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.293923 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run-ovn\") pod \"407599ac-f3f3-431a-943b-85c17c754c46\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.293965 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-log-ovn\") pod \"407599ac-f3f3-431a-943b-85c17c754c46\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.293983 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run\") pod \"407599ac-f3f3-431a-943b-85c17c754c46\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294062 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "407599ac-f3f3-431a-943b-85c17c754c46" (UID: "407599ac-f3f3-431a-943b-85c17c754c46"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294075 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "407599ac-f3f3-431a-943b-85c17c754c46" (UID: "407599ac-f3f3-431a-943b-85c17c754c46"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294116 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-additional-scripts\") pod \"407599ac-f3f3-431a-943b-85c17c754c46\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294199 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-scripts\") pod \"407599ac-f3f3-431a-943b-85c17c754c46\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294221 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2ll\" (UniqueName: \"kubernetes.io/projected/407599ac-f3f3-431a-943b-85c17c754c46-kube-api-access-xz2ll\") pod \"407599ac-f3f3-431a-943b-85c17c754c46\" (UID: \"407599ac-f3f3-431a-943b-85c17c754c46\") " Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294215 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run" (OuterVolumeSpecName: "var-run") pod "407599ac-f3f3-431a-943b-85c17c754c46" (UID: "407599ac-f3f3-431a-943b-85c17c754c46"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294736 4948 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294754 4948 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294763 4948 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/407599ac-f3f3-431a-943b-85c17c754c46-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.294912 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "407599ac-f3f3-431a-943b-85c17c754c46" (UID: "407599ac-f3f3-431a-943b-85c17c754c46"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.295316 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-scripts" (OuterVolumeSpecName: "scripts") pod "407599ac-f3f3-431a-943b-85c17c754c46" (UID: "407599ac-f3f3-431a-943b-85c17c754c46"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.302736 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407599ac-f3f3-431a-943b-85c17c754c46-kube-api-access-xz2ll" (OuterVolumeSpecName: "kube-api-access-xz2ll") pod "407599ac-f3f3-431a-943b-85c17c754c46" (UID: "407599ac-f3f3-431a-943b-85c17c754c46"). InnerVolumeSpecName "kube-api-access-xz2ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.396249 4948 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.396657 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/407599ac-f3f3-431a-943b-85c17c754c46-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.396670 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2ll\" (UniqueName: \"kubernetes.io/projected/407599ac-f3f3-431a-943b-85c17c754c46-kube-api-access-xz2ll\") on node \"crc\" DevicePath \"\"" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.774745 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch-config-j87xt" event={"ID":"407599ac-f3f3-431a-943b-85c17c754c46","Type":"ContainerDied","Data":"2e63773cc30c7c8d59865b71f33b826cfe8e25a9e4b2d425dfa080aca819f174"} Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.774817 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e63773cc30c7c8d59865b71f33b826cfe8e25a9e4b2d425dfa080aca819f174" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.774924 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-bd2ch-config-j87xt" Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.779647 4948 generic.go:334] "Generic (PLEG): container finished" podID="c74958a4-caed-4579-b0ff-cbabe46b09dd" containerID="41c55b55d495ef9b147a733c4d666ff5ede3c80eb031a735bf7deb9b73dcdf08" exitCode=0 Dec 04 17:55:52 crc kubenswrapper[4948]: I1204 17:55:52.779709 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2wcvz" event={"ID":"c74958a4-caed-4579-b0ff-cbabe46b09dd","Type":"ContainerDied","Data":"41c55b55d495ef9b147a733c4d666ff5ede3c80eb031a735bf7deb9b73dcdf08"} Dec 04 17:55:53 crc kubenswrapper[4948]: I1204 17:55:53.253983 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bd2ch-config-j87xt"] Dec 04 17:55:53 crc kubenswrapper[4948]: I1204 17:55:53.259939 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bd2ch-config-j87xt"] Dec 04 17:55:53 crc kubenswrapper[4948]: I1204 17:55:53.772943 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-bd2ch" Dec 04 17:55:53 crc kubenswrapper[4948]: I1204 17:55:53.913906 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:55:53 crc kubenswrapper[4948]: E1204 17:55:53.914291 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:55:54 crc kubenswrapper[4948]: I1204 17:55:54.923263 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="407599ac-f3f3-431a-943b-85c17c754c46" path="/var/lib/kubelet/pods/407599ac-f3f3-431a-943b-85c17c754c46/volumes" Dec 04 17:55:58 crc kubenswrapper[4948]: I1204 17:55:58.212369 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:58 crc kubenswrapper[4948]: I1204 17:55:58.220689 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"swift-storage-0\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " pod="openstack/swift-storage-0" Dec 04 17:55:58 crc kubenswrapper[4948]: I1204 17:55:58.280338 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 17:55:59 crc kubenswrapper[4948]: I1204 17:55:59.730215 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Dec 04 17:56:00 crc kubenswrapper[4948]: I1204 17:56:00.029445 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 04 17:56:06 crc kubenswrapper[4948]: I1204 17:56:06.914325 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:56:06 crc kubenswrapper[4948]: E1204 17:56:06.915253 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:56:09 crc kubenswrapper[4948]: I1204 17:56:09.728960 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.027592 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.167286 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2wcvz"
Dec 04 17:56:10 crc kubenswrapper[4948]: E1204 17:56:10.224505 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified"
Dec 04 17:56:10 crc kubenswrapper[4948]: E1204 17:56:10.224671 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rl8fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-jn67c_openstack(55887774-d332-4083-8f3c-6281330114cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.224760 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-dispersionconf\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.225017 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-scripts\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.225081 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c74958a4-caed-4579-b0ff-cbabe46b09dd-etc-swift\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.225207 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-ring-data-devices\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.225231 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-swiftconf\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.225260 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd8wd\" (UniqueName: \"kubernetes.io/projected/c74958a4-caed-4579-b0ff-cbabe46b09dd-kube-api-access-fd8wd\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.225298 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-combined-ca-bundle\") pod \"c74958a4-caed-4579-b0ff-cbabe46b09dd\" (UID: \"c74958a4-caed-4579-b0ff-cbabe46b09dd\") "
Dec 04 17:56:10 crc kubenswrapper[4948]: E1204 17:56:10.226436 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-jn67c" podUID="55887774-d332-4083-8f3c-6281330114cd"
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.227336 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c74958a4-caed-4579-b0ff-cbabe46b09dd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.228721 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.240357 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74958a4-caed-4579-b0ff-cbabe46b09dd-kube-api-access-fd8wd" (OuterVolumeSpecName: "kube-api-access-fd8wd") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "kube-api-access-fd8wd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.257960 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.270465 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.272784 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-scripts" (OuterVolumeSpecName: "scripts") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.279388 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c74958a4-caed-4579-b0ff-cbabe46b09dd" (UID: "c74958a4-caed-4579-b0ff-cbabe46b09dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327176 4948 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-dispersionconf\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327239 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327249 4948 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c74958a4-caed-4579-b0ff-cbabe46b09dd-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327258 4948 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c74958a4-caed-4579-b0ff-cbabe46b09dd-ring-data-devices\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327269 4948 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-swiftconf\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327279 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd8wd\" (UniqueName: \"kubernetes.io/projected/c74958a4-caed-4579-b0ff-cbabe46b09dd-kube-api-access-fd8wd\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.327289 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c74958a4-caed-4579-b0ff-cbabe46b09dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.621528 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.943601 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"09d33a5cf80f62c8d95761a44c450d2dfb78eb56feb8959b358ccdffeb6f8f27"}
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.945789 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2wcvz"
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.945778 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2wcvz" event={"ID":"c74958a4-caed-4579-b0ff-cbabe46b09dd","Type":"ContainerDied","Data":"5fba489d3bdcd576861b11e62ba7976f504dbee1494a255b54e7242589ecd2b2"}
Dec 04 17:56:10 crc kubenswrapper[4948]: I1204 17:56:10.945948 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fba489d3bdcd576861b11e62ba7976f504dbee1494a255b54e7242589ecd2b2"
Dec 04 17:56:10 crc kubenswrapper[4948]: E1204 17:56:10.946845 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-jn67c" podUID="55887774-d332-4083-8f3c-6281330114cd"
Dec 04 17:56:13 crc kubenswrapper[4948]: I1204 17:56:13.972528 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069"}
Dec 04 17:56:13 crc kubenswrapper[4948]: I1204 17:56:13.973188 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3"}
Dec 04 17:56:13 crc kubenswrapper[4948]: I1204 17:56:13.973206 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5"}
Dec 04 17:56:13 crc kubenswrapper[4948]: I1204 17:56:13.973218 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a"}
Dec 04 17:56:16 crc kubenswrapper[4948]: I1204 17:56:16.997430 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a"}
Dec 04 17:56:16 crc kubenswrapper[4948]: I1204 17:56:16.997994 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77"}
Dec 04 17:56:16 crc kubenswrapper[4948]: I1204 17:56:16.998057 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430"}
Dec 04 17:56:16 crc kubenswrapper[4948]: I1204 17:56:16.998071 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c"}
Dec 04 17:56:17 crc kubenswrapper[4948]: I1204 17:56:17.913426 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5"
Dec 04 17:56:17 crc kubenswrapper[4948]: E1204 17:56:17.913835 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2"
Dec 04 17:56:18 crc kubenswrapper[4948]: I1204 17:56:18.010554 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528"}
Dec 04 17:56:19 crc kubenswrapper[4948]: I1204 17:56:19.023614 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4"}
Dec 04 17:56:19 crc kubenswrapper[4948]: I1204 17:56:19.023927 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd"}
Dec 04 17:56:19 crc kubenswrapper[4948]: I1204 17:56:19.023944 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16"}
Dec 04 17:56:19 crc kubenswrapper[4948]: I1204 17:56:19.023960 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c"}
Dec 04 17:56:19 crc kubenswrapper[4948]: I1204 17:56:19.729402 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.029245 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.043091 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d"}
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.043142 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerStarted","Data":"a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347"}
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.065135 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tmvtp"]
Dec 04 17:56:20 crc kubenswrapper[4948]: E1204 17:56:20.065530 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74958a4-caed-4579-b0ff-cbabe46b09dd" containerName="swift-ring-rebalance"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.065552 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74958a4-caed-4579-b0ff-cbabe46b09dd" containerName="swift-ring-rebalance"
Dec 04 17:56:20 crc kubenswrapper[4948]: E1204 17:56:20.065587 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407599ac-f3f3-431a-943b-85c17c754c46" containerName="ovn-config"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.065597 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="407599ac-f3f3-431a-943b-85c17c754c46" containerName="ovn-config"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.065777 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="407599ac-f3f3-431a-943b-85c17c754c46" containerName="ovn-config"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.065827 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74958a4-caed-4579-b0ff-cbabe46b09dd" containerName="swift-ring-rebalance"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.071482 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.104624 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tmvtp"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.140482 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5cjp\" (UniqueName: \"kubernetes.io/projected/baa21dba-653c-4cec-9ecd-09a6e1dfa082-kube-api-access-k5cjp\") pod \"cinder-db-create-tmvtp\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.140596 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa21dba-653c-4cec-9ecd-09a6e1dfa082-operator-scripts\") pod \"cinder-db-create-tmvtp\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.153922 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=48.093610819 podStartE2EDuration="55.153896565s" podCreationTimestamp="2025-12-04 17:55:25 +0000 UTC" firstStartedPulling="2025-12-04 17:56:10.626500574 +0000 UTC m=+1781.987574996" lastFinishedPulling="2025-12-04 17:56:17.68678635 +0000 UTC m=+1789.047860742" observedRunningTime="2025-12-04 17:56:20.148713442 +0000 UTC m=+1791.509787844" watchObservedRunningTime="2025-12-04 17:56:20.153896565 +0000 UTC m=+1791.514970967"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.178287 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-hqqtd"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.179679 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.183829 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aee8-account-create-update-cllhj"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.185204 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.186686 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.199887 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aee8-account-create-update-cllhj"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.215485 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hqqtd"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.241919 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5cjp\" (UniqueName: \"kubernetes.io/projected/baa21dba-653c-4cec-9ecd-09a6e1dfa082-kube-api-access-k5cjp\") pod \"cinder-db-create-tmvtp\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.242016 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa21dba-653c-4cec-9ecd-09a6e1dfa082-operator-scripts\") pod \"cinder-db-create-tmvtp\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.242699 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa21dba-653c-4cec-9ecd-09a6e1dfa082-operator-scripts\") pod \"cinder-db-create-tmvtp\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.274853 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5cjp\" (UniqueName: \"kubernetes.io/projected/baa21dba-653c-4cec-9ecd-09a6e1dfa082-kube-api-access-k5cjp\") pod \"cinder-db-create-tmvtp\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.285289 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-20a5-account-create-update-9x655"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.286291 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.289351 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.306143 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-20a5-account-create-update-9x655"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.343599 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8td\" (UniqueName: \"kubernetes.io/projected/63125130-8f44-4d42-8fa9-2631c2c3d8ec-kube-api-access-fc8td\") pod \"barbican-aee8-account-create-update-cllhj\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.344020 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-operator-scripts\") pod \"barbican-db-create-hqqtd\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.344109 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlkz\" (UniqueName: \"kubernetes.io/projected/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-kube-api-access-qdlkz\") pod \"barbican-db-create-hqqtd\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.344178 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63125130-8f44-4d42-8fa9-2631c2c3d8ec-operator-scripts\") pod \"barbican-aee8-account-create-update-cllhj\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.396428 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tmvtp"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.449207 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-operator-scripts\") pod \"barbican-db-create-hqqtd\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.449285 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlkz\" (UniqueName: \"kubernetes.io/projected/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-kube-api-access-qdlkz\") pod \"barbican-db-create-hqqtd\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.449337 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63125130-8f44-4d42-8fa9-2631c2c3d8ec-operator-scripts\") pod \"barbican-aee8-account-create-update-cllhj\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.449377 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v86tq\" (UniqueName: \"kubernetes.io/projected/c2d49c2c-6474-4667-ba8c-21c2a24e4522-kube-api-access-v86tq\") pod \"cinder-20a5-account-create-update-9x655\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.449436 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d49c2c-6474-4667-ba8c-21c2a24e4522-operator-scripts\") pod \"cinder-20a5-account-create-update-9x655\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.449473 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8td\" (UniqueName: \"kubernetes.io/projected/63125130-8f44-4d42-8fa9-2631c2c3d8ec-kube-api-access-fc8td\") pod \"barbican-aee8-account-create-update-cllhj\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.450183 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-operator-scripts\") pod \"barbican-db-create-hqqtd\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.450947 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63125130-8f44-4d42-8fa9-2631c2c3d8ec-operator-scripts\") pod \"barbican-aee8-account-create-update-cllhj\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.452419 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-p84jl"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.453972 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.461488 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.471264 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-p84jl"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.484400 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlkz\" (UniqueName: \"kubernetes.io/projected/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-kube-api-access-qdlkz\") pod \"barbican-db-create-hqqtd\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.494966 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8td\" (UniqueName: \"kubernetes.io/projected/63125130-8f44-4d42-8fa9-2631c2c3d8ec-kube-api-access-fc8td\") pod \"barbican-aee8-account-create-update-cllhj\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.507246 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqqtd"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.512587 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aee8-account-create-update-cllhj"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.556619 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6kbvb"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.557694 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6kbvb"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.558862 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d49c2c-6474-4667-ba8c-21c2a24e4522-operator-scripts\") pod \"cinder-20a5-account-create-update-9x655\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.558907 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.558927 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ggw\" (UniqueName: \"kubernetes.io/projected/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-kube-api-access-v9ggw\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.558978 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.559001 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.559021 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-config\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.559179 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.559229 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v86tq\" (UniqueName: \"kubernetes.io/projected/c2d49c2c-6474-4667-ba8c-21c2a24e4522-kube-api-access-v86tq\") pod \"cinder-20a5-account-create-update-9x655\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.560324 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d49c2c-6474-4667-ba8c-21c2a24e4522-operator-scripts\") pod \"cinder-20a5-account-create-update-9x655\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.577181 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2v5d6"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.578397 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2v5d6"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.583691 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.586200 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ltmz5"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.586332 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.586375 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.595276 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4bb1-account-create-update-lzs7x"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.599238 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bb1-account-create-update-lzs7x"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.604686 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v86tq\" (UniqueName: \"kubernetes.io/projected/c2d49c2c-6474-4667-ba8c-21c2a24e4522-kube-api-access-v86tq\") pod \"cinder-20a5-account-create-update-9x655\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " pod="openstack/cinder-20a5-account-create-update-9x655"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.615315 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.618091 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2v5d6"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.628881 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4bb1-account-create-update-lzs7x"]
Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.633185 4948 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-20a5-account-create-update-9x655" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.636457 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6kbvb"] Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661820 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ggw\" (UniqueName: \"kubernetes.io/projected/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-kube-api-access-v9ggw\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661866 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trn9h\" (UniqueName: \"kubernetes.io/projected/145f54f7-b50a-4d77-8152-5d8986faa646-kube-api-access-trn9h\") pod \"neutron-db-create-6kbvb\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661899 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661921 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-combined-ca-bundle\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661936 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661955 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-config\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661975 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hvqj\" (UniqueName: \"kubernetes.io/projected/c7d77e21-3036-4810-98ec-1a44a3f882df-kube-api-access-6hvqj\") pod \"neutron-4bb1-account-create-update-lzs7x\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.661992 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145f54f7-b50a-4d77-8152-5d8986faa646-operator-scripts\") pod \"neutron-db-create-6kbvb\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.662018 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.662077 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7cndm\" (UniqueName: \"kubernetes.io/projected/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-kube-api-access-7cndm\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.662099 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-config-data\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.662128 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d77e21-3036-4810-98ec-1a44a3f882df-operator-scripts\") pod \"neutron-4bb1-account-create-update-lzs7x\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.662165 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.663880 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.664101 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.665852 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.666997 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-config\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.680480 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.696247 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ggw\" (UniqueName: \"kubernetes.io/projected/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-kube-api-access-v9ggw\") pod \"dnsmasq-dns-5c79d794d7-p84jl\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763576 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-config-data\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763648 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d77e21-3036-4810-98ec-1a44a3f882df-operator-scripts\") pod \"neutron-4bb1-account-create-update-lzs7x\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763725 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trn9h\" (UniqueName: \"kubernetes.io/projected/145f54f7-b50a-4d77-8152-5d8986faa646-kube-api-access-trn9h\") pod \"neutron-db-create-6kbvb\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763772 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-combined-ca-bundle\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763809 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hvqj\" (UniqueName: \"kubernetes.io/projected/c7d77e21-3036-4810-98ec-1a44a3f882df-kube-api-access-6hvqj\") pod \"neutron-4bb1-account-create-update-lzs7x\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763835 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/145f54f7-b50a-4d77-8152-5d8986faa646-operator-scripts\") pod \"neutron-db-create-6kbvb\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.763898 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cndm\" (UniqueName: \"kubernetes.io/projected/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-kube-api-access-7cndm\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.765996 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d77e21-3036-4810-98ec-1a44a3f882df-operator-scripts\") pod \"neutron-4bb1-account-create-update-lzs7x\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.768664 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145f54f7-b50a-4d77-8152-5d8986faa646-operator-scripts\") pod \"neutron-db-create-6kbvb\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.769387 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-combined-ca-bundle\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.778658 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-config-data\") pod 
\"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.783725 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trn9h\" (UniqueName: \"kubernetes.io/projected/145f54f7-b50a-4d77-8152-5d8986faa646-kube-api-access-trn9h\") pod \"neutron-db-create-6kbvb\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.798157 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cndm\" (UniqueName: \"kubernetes.io/projected/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-kube-api-access-7cndm\") pod \"keystone-db-sync-2v5d6\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.809732 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hvqj\" (UniqueName: \"kubernetes.io/projected/c7d77e21-3036-4810-98ec-1a44a3f882df-kube-api-access-6hvqj\") pod \"neutron-4bb1-account-create-update-lzs7x\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.913270 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.941144 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.942733 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.952733 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hqqtd"] Dec 04 17:56:20 crc kubenswrapper[4948]: I1204 17:56:20.961294 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.042941 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tmvtp"] Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.054600 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hqqtd" event={"ID":"505f7a05-8fe4-4e76-b5ed-45339ebda3dc","Type":"ContainerStarted","Data":"c3f82125070c610bce9280ae40ce81d852574b68adcdc6071e939f00ccc44361"} Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.061279 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aee8-account-create-update-cllhj"] Dec 04 17:56:21 crc kubenswrapper[4948]: W1204 17:56:21.066833 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63125130_8f44_4d42_8fa9_2631c2c3d8ec.slice/crio-8ef2afb98eb44c9d06ac460f838f69eb3f44569f9bf82034b90ef67f1f7c0e51 WatchSource:0}: Error finding container 8ef2afb98eb44c9d06ac460f838f69eb3f44569f9bf82034b90ef67f1f7c0e51: Status 404 returned error can't find the container with id 8ef2afb98eb44c9d06ac460f838f69eb3f44569f9bf82034b90ef67f1f7c0e51 Dec 04 17:56:21 crc kubenswrapper[4948]: W1204 17:56:21.068023 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaa21dba_653c_4cec_9ecd_09a6e1dfa082.slice/crio-1a4b68fb83e810176e0e42a8979437989a4961f12715182a3ad21e272a620b9b WatchSource:0}: Error finding container 
1a4b68fb83e810176e0e42a8979437989a4961f12715182a3ad21e272a620b9b: Status 404 returned error can't find the container with id 1a4b68fb83e810176e0e42a8979437989a4961f12715182a3ad21e272a620b9b Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.217175 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-20a5-account-create-update-9x655"] Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.300394 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6kbvb"] Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.401034 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-p84jl"] Dec 04 17:56:21 crc kubenswrapper[4948]: W1204 17:56:21.405671 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ffb4d75_52dc_4cf7_90a9_7577b5dea591.slice/crio-63870ffa76e8a629eeb58392a183657894ad80203bed60421a58a4c7a0e2e4b6 WatchSource:0}: Error finding container 63870ffa76e8a629eeb58392a183657894ad80203bed60421a58a4c7a0e2e4b6: Status 404 returned error can't find the container with id 63870ffa76e8a629eeb58392a183657894ad80203bed60421a58a4c7a0e2e4b6 Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.477647 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2v5d6"] Dec 04 17:56:21 crc kubenswrapper[4948]: W1204 17:56:21.553361 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad839bcb_16b3_4321_8cf7_5e698ea7b32d.slice/crio-334d8028f5b0024a2f97481cc67c11640649f0116aff4cd6589854a906501e9e WatchSource:0}: Error finding container 334d8028f5b0024a2f97481cc67c11640649f0116aff4cd6589854a906501e9e: Status 404 returned error can't find the container with id 334d8028f5b0024a2f97481cc67c11640649f0116aff4cd6589854a906501e9e Dec 04 17:56:21 crc kubenswrapper[4948]: I1204 17:56:21.568426 4948 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4bb1-account-create-update-lzs7x"] Dec 04 17:56:21 crc kubenswrapper[4948]: W1204 17:56:21.583214 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7d77e21_3036_4810_98ec_1a44a3f882df.slice/crio-988584cb95dc037238ea81d44d405cb1e16728428c276dc8ac4135a16ec6df1c WatchSource:0}: Error finding container 988584cb95dc037238ea81d44d405cb1e16728428c276dc8ac4135a16ec6df1c: Status 404 returned error can't find the container with id 988584cb95dc037238ea81d44d405cb1e16728428c276dc8ac4135a16ec6df1c Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.064016 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bb1-account-create-update-lzs7x" event={"ID":"c7d77e21-3036-4810-98ec-1a44a3f882df","Type":"ContainerStarted","Data":"9593921e030453b187f6e607d5dbc767989b02108eb3b864ff35cfd587ce4f9a"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.064333 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bb1-account-create-update-lzs7x" event={"ID":"c7d77e21-3036-4810-98ec-1a44a3f882df","Type":"ContainerStarted","Data":"988584cb95dc037238ea81d44d405cb1e16728428c276dc8ac4135a16ec6df1c"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.073880 4948 generic.go:334] "Generic (PLEG): container finished" podID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerID="34d55ac9e6ea4ab06758c3e1d7eb7a3ceeb54e98b523e9ae77ca0a5cf205fa8a" exitCode=0 Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.074008 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" event={"ID":"9ffb4d75-52dc-4cf7-90a9-7577b5dea591","Type":"ContainerDied","Data":"34d55ac9e6ea4ab06758c3e1d7eb7a3ceeb54e98b523e9ae77ca0a5cf205fa8a"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.074056 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" event={"ID":"9ffb4d75-52dc-4cf7-90a9-7577b5dea591","Type":"ContainerStarted","Data":"63870ffa76e8a629eeb58392a183657894ad80203bed60421a58a4c7a0e2e4b6"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.076753 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2v5d6" event={"ID":"ad839bcb-16b3-4321-8cf7-5e698ea7b32d","Type":"ContainerStarted","Data":"334d8028f5b0024a2f97481cc67c11640649f0116aff4cd6589854a906501e9e"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.084949 4948 generic.go:334] "Generic (PLEG): container finished" podID="baa21dba-653c-4cec-9ecd-09a6e1dfa082" containerID="bb1acd9ac03f710a41e03bf6f8bb7f5222fb1c46ba1a6e8e77781d9d9c3dd560" exitCode=0 Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.085071 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tmvtp" event={"ID":"baa21dba-653c-4cec-9ecd-09a6e1dfa082","Type":"ContainerDied","Data":"bb1acd9ac03f710a41e03bf6f8bb7f5222fb1c46ba1a6e8e77781d9d9c3dd560"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.085110 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tmvtp" event={"ID":"baa21dba-653c-4cec-9ecd-09a6e1dfa082","Type":"ContainerStarted","Data":"1a4b68fb83e810176e0e42a8979437989a4961f12715182a3ad21e272a620b9b"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.086121 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-4bb1-account-create-update-lzs7x" podStartSLOduration=2.086098144 podStartE2EDuration="2.086098144s" podCreationTimestamp="2025-12-04 17:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:56:22.081904321 +0000 UTC m=+1793.442978723" watchObservedRunningTime="2025-12-04 17:56:22.086098144 +0000 UTC m=+1793.447172546" Dec 04 17:56:22 crc kubenswrapper[4948]: 
I1204 17:56:22.092917 4948 generic.go:334] "Generic (PLEG): container finished" podID="505f7a05-8fe4-4e76-b5ed-45339ebda3dc" containerID="fd03367607ea4bd510783f1e087daf032195a481e4a39c53b6a6ee638bdff39b" exitCode=0 Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.092983 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hqqtd" event={"ID":"505f7a05-8fe4-4e76-b5ed-45339ebda3dc","Type":"ContainerDied","Data":"fd03367607ea4bd510783f1e087daf032195a481e4a39c53b6a6ee638bdff39b"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.097868 4948 generic.go:334] "Generic (PLEG): container finished" podID="63125130-8f44-4d42-8fa9-2631c2c3d8ec" containerID="27d9e1e5f2ef25fdf36a94ebfb879f2251655923eb85fc4f49b2a22dcc40fe16" exitCode=0 Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.097943 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aee8-account-create-update-cllhj" event={"ID":"63125130-8f44-4d42-8fa9-2631c2c3d8ec","Type":"ContainerDied","Data":"27d9e1e5f2ef25fdf36a94ebfb879f2251655923eb85fc4f49b2a22dcc40fe16"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.097977 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aee8-account-create-update-cllhj" event={"ID":"63125130-8f44-4d42-8fa9-2631c2c3d8ec","Type":"ContainerStarted","Data":"8ef2afb98eb44c9d06ac460f838f69eb3f44569f9bf82034b90ef67f1f7c0e51"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.125805 4948 generic.go:334] "Generic (PLEG): container finished" podID="145f54f7-b50a-4d77-8152-5d8986faa646" containerID="584000ffffd365ef10e159d6f513183031557b7f69cda1709b0016cea7426d99" exitCode=0 Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.125866 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6kbvb" event={"ID":"145f54f7-b50a-4d77-8152-5d8986faa646","Type":"ContainerDied","Data":"584000ffffd365ef10e159d6f513183031557b7f69cda1709b0016cea7426d99"} Dec 
04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.125889 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6kbvb" event={"ID":"145f54f7-b50a-4d77-8152-5d8986faa646","Type":"ContainerStarted","Data":"f1dcff9e831ba8c4b60a465997aebd219348069e1f5bd67c3eab4469ebee6c5d"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.127892 4948 generic.go:334] "Generic (PLEG): container finished" podID="c2d49c2c-6474-4667-ba8c-21c2a24e4522" containerID="31c1e49ead72861127107f90ce0fa37d7e78f909caae6944d8fec5ea338bd72e" exitCode=0 Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.127949 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-20a5-account-create-update-9x655" event={"ID":"c2d49c2c-6474-4667-ba8c-21c2a24e4522","Type":"ContainerDied","Data":"31c1e49ead72861127107f90ce0fa37d7e78f909caae6944d8fec5ea338bd72e"} Dec 04 17:56:22 crc kubenswrapper[4948]: I1204 17:56:22.127982 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-20a5-account-create-update-9x655" event={"ID":"c2d49c2c-6474-4667-ba8c-21c2a24e4522","Type":"ContainerStarted","Data":"b29503e398fa820a32e88b205153e98e29bfafe5141a9a8663f9fb4e145a73ff"} Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.136504 4948 generic.go:334] "Generic (PLEG): container finished" podID="c7d77e21-3036-4810-98ec-1a44a3f882df" containerID="9593921e030453b187f6e607d5dbc767989b02108eb3b864ff35cfd587ce4f9a" exitCode=0 Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.136805 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bb1-account-create-update-lzs7x" event={"ID":"c7d77e21-3036-4810-98ec-1a44a3f882df","Type":"ContainerDied","Data":"9593921e030453b187f6e607d5dbc767989b02108eb3b864ff35cfd587ce4f9a"} Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.139564 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" 
event={"ID":"9ffb4d75-52dc-4cf7-90a9-7577b5dea591","Type":"ContainerStarted","Data":"b55807d6a1c99dcbcfb7a907c49525bc4b6ce1f050dd6a8f6afa9fe3a1b59140"} Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.139809 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.185036 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" podStartSLOduration=3.185011193 podStartE2EDuration="3.185011193s" podCreationTimestamp="2025-12-04 17:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:56:23.174977728 +0000 UTC m=+1794.536052130" watchObservedRunningTime="2025-12-04 17:56:23.185011193 +0000 UTC m=+1794.546085595" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.459931 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tmvtp" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.516682 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa21dba-653c-4cec-9ecd-09a6e1dfa082-operator-scripts\") pod \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.516776 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5cjp\" (UniqueName: \"kubernetes.io/projected/baa21dba-653c-4cec-9ecd-09a6e1dfa082-kube-api-access-k5cjp\") pod \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\" (UID: \"baa21dba-653c-4cec-9ecd-09a6e1dfa082\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.517690 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa21dba-653c-4cec-9ecd-09a6e1dfa082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "baa21dba-653c-4cec-9ecd-09a6e1dfa082" (UID: "baa21dba-653c-4cec-9ecd-09a6e1dfa082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.539395 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa21dba-653c-4cec-9ecd-09a6e1dfa082-kube-api-access-k5cjp" (OuterVolumeSpecName: "kube-api-access-k5cjp") pod "baa21dba-653c-4cec-9ecd-09a6e1dfa082" (UID: "baa21dba-653c-4cec-9ecd-09a6e1dfa082"). InnerVolumeSpecName "kube-api-access-k5cjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.622672 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/baa21dba-653c-4cec-9ecd-09a6e1dfa082-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.622737 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5cjp\" (UniqueName: \"kubernetes.io/projected/baa21dba-653c-4cec-9ecd-09a6e1dfa082-kube-api-access-k5cjp\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.709476 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqqtd" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.729476 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-20a5-account-create-update-9x655" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.732238 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aee8-account-create-update-cllhj" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.756829 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.828893 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdlkz\" (UniqueName: \"kubernetes.io/projected/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-kube-api-access-qdlkz\") pod \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.828956 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8td\" (UniqueName: \"kubernetes.io/projected/63125130-8f44-4d42-8fa9-2631c2c3d8ec-kube-api-access-fc8td\") pod \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829029 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v86tq\" (UniqueName: \"kubernetes.io/projected/c2d49c2c-6474-4667-ba8c-21c2a24e4522-kube-api-access-v86tq\") pod \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829100 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63125130-8f44-4d42-8fa9-2631c2c3d8ec-operator-scripts\") pod \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\" (UID: \"63125130-8f44-4d42-8fa9-2631c2c3d8ec\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829162 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-operator-scripts\") pod \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\" (UID: \"505f7a05-8fe4-4e76-b5ed-45339ebda3dc\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829181 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d49c2c-6474-4667-ba8c-21c2a24e4522-operator-scripts\") pod \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\" (UID: \"c2d49c2c-6474-4667-ba8c-21c2a24e4522\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829625 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63125130-8f44-4d42-8fa9-2631c2c3d8ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63125130-8f44-4d42-8fa9-2631c2c3d8ec" (UID: "63125130-8f44-4d42-8fa9-2631c2c3d8ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829880 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d49c2c-6474-4667-ba8c-21c2a24e4522-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2d49c2c-6474-4667-ba8c-21c2a24e4522" (UID: "c2d49c2c-6474-4667-ba8c-21c2a24e4522"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.829984 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "505f7a05-8fe4-4e76-b5ed-45339ebda3dc" (UID: "505f7a05-8fe4-4e76-b5ed-45339ebda3dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.832992 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d49c2c-6474-4667-ba8c-21c2a24e4522-kube-api-access-v86tq" (OuterVolumeSpecName: "kube-api-access-v86tq") pod "c2d49c2c-6474-4667-ba8c-21c2a24e4522" (UID: "c2d49c2c-6474-4667-ba8c-21c2a24e4522"). 
InnerVolumeSpecName "kube-api-access-v86tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.834080 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63125130-8f44-4d42-8fa9-2631c2c3d8ec-kube-api-access-fc8td" (OuterVolumeSpecName: "kube-api-access-fc8td") pod "63125130-8f44-4d42-8fa9-2631c2c3d8ec" (UID: "63125130-8f44-4d42-8fa9-2631c2c3d8ec"). InnerVolumeSpecName "kube-api-access-fc8td". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.835364 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-kube-api-access-qdlkz" (OuterVolumeSpecName: "kube-api-access-qdlkz") pod "505f7a05-8fe4-4e76-b5ed-45339ebda3dc" (UID: "505f7a05-8fe4-4e76-b5ed-45339ebda3dc"). InnerVolumeSpecName "kube-api-access-qdlkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.932324 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145f54f7-b50a-4d77-8152-5d8986faa646-operator-scripts\") pod \"145f54f7-b50a-4d77-8152-5d8986faa646\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.932566 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trn9h\" (UniqueName: \"kubernetes.io/projected/145f54f7-b50a-4d77-8152-5d8986faa646-kube-api-access-trn9h\") pod \"145f54f7-b50a-4d77-8152-5d8986faa646\" (UID: \"145f54f7-b50a-4d77-8152-5d8986faa646\") " Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.933388 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdlkz\" (UniqueName: 
\"kubernetes.io/projected/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-kube-api-access-qdlkz\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.933408 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8td\" (UniqueName: \"kubernetes.io/projected/63125130-8f44-4d42-8fa9-2631c2c3d8ec-kube-api-access-fc8td\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.933420 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v86tq\" (UniqueName: \"kubernetes.io/projected/c2d49c2c-6474-4667-ba8c-21c2a24e4522-kube-api-access-v86tq\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.933456 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63125130-8f44-4d42-8fa9-2631c2c3d8ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.933467 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505f7a05-8fe4-4e76-b5ed-45339ebda3dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.933477 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2d49c2c-6474-4667-ba8c-21c2a24e4522-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.940568 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/145f54f7-b50a-4d77-8152-5d8986faa646-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "145f54f7-b50a-4d77-8152-5d8986faa646" (UID: "145f54f7-b50a-4d77-8152-5d8986faa646"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:23 crc kubenswrapper[4948]: I1204 17:56:23.940727 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145f54f7-b50a-4d77-8152-5d8986faa646-kube-api-access-trn9h" (OuterVolumeSpecName: "kube-api-access-trn9h") pod "145f54f7-b50a-4d77-8152-5d8986faa646" (UID: "145f54f7-b50a-4d77-8152-5d8986faa646"). InnerVolumeSpecName "kube-api-access-trn9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.036487 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trn9h\" (UniqueName: \"kubernetes.io/projected/145f54f7-b50a-4d77-8152-5d8986faa646-kube-api-access-trn9h\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.036539 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/145f54f7-b50a-4d77-8152-5d8986faa646-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.147030 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6kbvb" event={"ID":"145f54f7-b50a-4d77-8152-5d8986faa646","Type":"ContainerDied","Data":"f1dcff9e831ba8c4b60a465997aebd219348069e1f5bd67c3eab4469ebee6c5d"} Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.147089 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1dcff9e831ba8c4b60a465997aebd219348069e1f5bd67c3eab4469ebee6c5d" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.147142 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6kbvb" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.153121 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-20a5-account-create-update-9x655" event={"ID":"c2d49c2c-6474-4667-ba8c-21c2a24e4522","Type":"ContainerDied","Data":"b29503e398fa820a32e88b205153e98e29bfafe5141a9a8663f9fb4e145a73ff"} Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.153211 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-20a5-account-create-update-9x655" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.154896 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tmvtp" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.156893 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqqtd" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158197 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29503e398fa820a32e88b205153e98e29bfafe5141a9a8663f9fb4e145a73ff" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158222 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tmvtp" event={"ID":"baa21dba-653c-4cec-9ecd-09a6e1dfa082","Type":"ContainerDied","Data":"1a4b68fb83e810176e0e42a8979437989a4961f12715182a3ad21e272a620b9b"} Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158236 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a4b68fb83e810176e0e42a8979437989a4961f12715182a3ad21e272a620b9b" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158246 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hqqtd" 
event={"ID":"505f7a05-8fe4-4e76-b5ed-45339ebda3dc","Type":"ContainerDied","Data":"c3f82125070c610bce9280ae40ce81d852574b68adcdc6071e939f00ccc44361"} Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158257 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f82125070c610bce9280ae40ce81d852574b68adcdc6071e939f00ccc44361" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158838 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aee8-account-create-update-cllhj" Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.158972 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aee8-account-create-update-cllhj" event={"ID":"63125130-8f44-4d42-8fa9-2631c2c3d8ec","Type":"ContainerDied","Data":"8ef2afb98eb44c9d06ac460f838f69eb3f44569f9bf82034b90ef67f1f7c0e51"} Dec 04 17:56:24 crc kubenswrapper[4948]: I1204 17:56:24.159284 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef2afb98eb44c9d06ac460f838f69eb3f44569f9bf82034b90ef67f1f7c0e51" Dec 04 17:56:26 crc kubenswrapper[4948]: I1204 17:56:26.793956 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:26 crc kubenswrapper[4948]: I1204 17:56:26.992035 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d77e21-3036-4810-98ec-1a44a3f882df-operator-scripts\") pod \"c7d77e21-3036-4810-98ec-1a44a3f882df\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " Dec 04 17:56:26 crc kubenswrapper[4948]: I1204 17:56:26.992387 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hvqj\" (UniqueName: \"kubernetes.io/projected/c7d77e21-3036-4810-98ec-1a44a3f882df-kube-api-access-6hvqj\") pod \"c7d77e21-3036-4810-98ec-1a44a3f882df\" (UID: \"c7d77e21-3036-4810-98ec-1a44a3f882df\") " Dec 04 17:56:26 crc kubenswrapper[4948]: I1204 17:56:26.992470 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d77e21-3036-4810-98ec-1a44a3f882df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7d77e21-3036-4810-98ec-1a44a3f882df" (UID: "c7d77e21-3036-4810-98ec-1a44a3f882df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:26 crc kubenswrapper[4948]: I1204 17:56:26.992707 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d77e21-3036-4810-98ec-1a44a3f882df-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:26 crc kubenswrapper[4948]: I1204 17:56:26.996872 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d77e21-3036-4810-98ec-1a44a3f882df-kube-api-access-6hvqj" (OuterVolumeSpecName: "kube-api-access-6hvqj") pod "c7d77e21-3036-4810-98ec-1a44a3f882df" (UID: "c7d77e21-3036-4810-98ec-1a44a3f882df"). InnerVolumeSpecName "kube-api-access-6hvqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:27 crc kubenswrapper[4948]: I1204 17:56:27.093858 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hvqj\" (UniqueName: \"kubernetes.io/projected/c7d77e21-3036-4810-98ec-1a44a3f882df-kube-api-access-6hvqj\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:27 crc kubenswrapper[4948]: I1204 17:56:27.184427 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4bb1-account-create-update-lzs7x" event={"ID":"c7d77e21-3036-4810-98ec-1a44a3f882df","Type":"ContainerDied","Data":"988584cb95dc037238ea81d44d405cb1e16728428c276dc8ac4135a16ec6df1c"} Dec 04 17:56:27 crc kubenswrapper[4948]: I1204 17:56:27.184485 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="988584cb95dc037238ea81d44d405cb1e16728428c276dc8ac4135a16ec6df1c" Dec 04 17:56:27 crc kubenswrapper[4948]: I1204 17:56:27.184506 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4bb1-account-create-update-lzs7x" Dec 04 17:56:27 crc kubenswrapper[4948]: I1204 17:56:27.187681 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2v5d6" event={"ID":"ad839bcb-16b3-4321-8cf7-5e698ea7b32d","Type":"ContainerStarted","Data":"c02365b05662e5ea5bd8a6b35e5c77b94f4aa4dc2c47d4a74dd31d23b02905f2"} Dec 04 17:56:27 crc kubenswrapper[4948]: I1204 17:56:27.214980 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2v5d6" podStartSLOduration=1.95211358 podStartE2EDuration="7.214953614s" podCreationTimestamp="2025-12-04 17:56:20 +0000 UTC" firstStartedPulling="2025-12-04 17:56:21.555639929 +0000 UTC m=+1792.916714331" lastFinishedPulling="2025-12-04 17:56:26.818479973 +0000 UTC m=+1798.179554365" observedRunningTime="2025-12-04 17:56:27.204896218 +0000 UTC m=+1798.565970640" watchObservedRunningTime="2025-12-04 17:56:27.214953614 +0000 UTC 
m=+1798.576028016" Dec 04 17:56:28 crc kubenswrapper[4948]: I1204 17:56:28.202784 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jn67c" event={"ID":"55887774-d332-4083-8f3c-6281330114cd","Type":"ContainerStarted","Data":"09a87ed238333efb556cbdeb3f665192194c7e862fa0cb98f5d2e669778e36e4"} Dec 04 17:56:28 crc kubenswrapper[4948]: I1204 17:56:28.232381 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jn67c" podStartSLOduration=3.899430723 podStartE2EDuration="45.232340922s" podCreationTimestamp="2025-12-04 17:55:43 +0000 UTC" firstStartedPulling="2025-12-04 17:55:45.463423382 +0000 UTC m=+1756.824497784" lastFinishedPulling="2025-12-04 17:56:26.796333581 +0000 UTC m=+1798.157407983" observedRunningTime="2025-12-04 17:56:28.223886744 +0000 UTC m=+1799.584961156" watchObservedRunningTime="2025-12-04 17:56:28.232340922 +0000 UTC m=+1799.593415344" Dec 04 17:56:30 crc kubenswrapper[4948]: I1204 17:56:30.924266 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:56:30 crc kubenswrapper[4948]: I1204 17:56:30.990706 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rp5mc"] Dec 04 17:56:30 crc kubenswrapper[4948]: I1204 17:56:30.990986 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerName="dnsmasq-dns" containerID="cri-o://92e0bc1e9eaccb74286935aa62862efcfa21a6d8d06ec096b9c55ab939a73593" gracePeriod=10 Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.227397 4948 generic.go:334] "Generic (PLEG): container finished" podID="ad839bcb-16b3-4321-8cf7-5e698ea7b32d" containerID="c02365b05662e5ea5bd8a6b35e5c77b94f4aa4dc2c47d4a74dd31d23b02905f2" exitCode=0 Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.227456 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-db-sync-2v5d6" event={"ID":"ad839bcb-16b3-4321-8cf7-5e698ea7b32d","Type":"ContainerDied","Data":"c02365b05662e5ea5bd8a6b35e5c77b94f4aa4dc2c47d4a74dd31d23b02905f2"} Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.229588 4948 generic.go:334] "Generic (PLEG): container finished" podID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerID="92e0bc1e9eaccb74286935aa62862efcfa21a6d8d06ec096b9c55ab939a73593" exitCode=0 Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.229666 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" event={"ID":"6e2de27d-b6aa-42a4-a11e-c9241d8b619d","Type":"ContainerDied","Data":"92e0bc1e9eaccb74286935aa62862efcfa21a6d8d06ec096b9c55ab939a73593"} Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.446461 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.573002 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-dns-svc\") pod \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.573091 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bxm9\" (UniqueName: \"kubernetes.io/projected/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-kube-api-access-5bxm9\") pod \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.573137 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-sb\") pod \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\" (UID: 
\"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.573294 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-config\") pod \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.573329 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-nb\") pod \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\" (UID: \"6e2de27d-b6aa-42a4-a11e-c9241d8b619d\") " Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.577873 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-kube-api-access-5bxm9" (OuterVolumeSpecName: "kube-api-access-5bxm9") pod "6e2de27d-b6aa-42a4-a11e-c9241d8b619d" (UID: "6e2de27d-b6aa-42a4-a11e-c9241d8b619d"). InnerVolumeSpecName "kube-api-access-5bxm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.613819 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e2de27d-b6aa-42a4-a11e-c9241d8b619d" (UID: "6e2de27d-b6aa-42a4-a11e-c9241d8b619d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.618583 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e2de27d-b6aa-42a4-a11e-c9241d8b619d" (UID: "6e2de27d-b6aa-42a4-a11e-c9241d8b619d"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.622739 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-config" (OuterVolumeSpecName: "config") pod "6e2de27d-b6aa-42a4-a11e-c9241d8b619d" (UID: "6e2de27d-b6aa-42a4-a11e-c9241d8b619d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.626580 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e2de27d-b6aa-42a4-a11e-c9241d8b619d" (UID: "6e2de27d-b6aa-42a4-a11e-c9241d8b619d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.674744 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.674775 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.674786 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.674798 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bxm9\" (UniqueName: \"kubernetes.io/projected/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-kube-api-access-5bxm9\") on node \"crc\" DevicePath 
\"\"" Dec 04 17:56:31 crc kubenswrapper[4948]: I1204 17:56:31.674808 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2de27d-b6aa-42a4-a11e-c9241d8b619d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.239212 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" event={"ID":"6e2de27d-b6aa-42a4-a11e-c9241d8b619d","Type":"ContainerDied","Data":"9944f73f50fef25ae809ad5782ad4fd887c7f3fbef550db75b631d6c7bccfbae"} Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.239255 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-rp5mc" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.239545 4948 scope.go:117] "RemoveContainer" containerID="92e0bc1e9eaccb74286935aa62862efcfa21a6d8d06ec096b9c55ab939a73593" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.271319 4948 scope.go:117] "RemoveContainer" containerID="faf818866523b856be156999c88d5833f4148f5dd491ab22a6cad19520684a39" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.311641 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rp5mc"] Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.323066 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-rp5mc"] Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.558683 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.698138 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cndm\" (UniqueName: \"kubernetes.io/projected/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-kube-api-access-7cndm\") pod \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.698210 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-combined-ca-bundle\") pod \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.698345 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-config-data\") pod \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\" (UID: \"ad839bcb-16b3-4321-8cf7-5e698ea7b32d\") " Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.702220 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-kube-api-access-7cndm" (OuterVolumeSpecName: "kube-api-access-7cndm") pod "ad839bcb-16b3-4321-8cf7-5e698ea7b32d" (UID: "ad839bcb-16b3-4321-8cf7-5e698ea7b32d"). InnerVolumeSpecName "kube-api-access-7cndm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.735202 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad839bcb-16b3-4321-8cf7-5e698ea7b32d" (UID: "ad839bcb-16b3-4321-8cf7-5e698ea7b32d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.736939 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-config-data" (OuterVolumeSpecName: "config-data") pod "ad839bcb-16b3-4321-8cf7-5e698ea7b32d" (UID: "ad839bcb-16b3-4321-8cf7-5e698ea7b32d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.799965 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cndm\" (UniqueName: \"kubernetes.io/projected/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-kube-api-access-7cndm\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.800001 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.800010 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad839bcb-16b3-4321-8cf7-5e698ea7b32d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:32 crc kubenswrapper[4948]: I1204 17:56:32.914101 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:56:32 crc kubenswrapper[4948]: E1204 17:56:32.914445 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:56:32 crc 
kubenswrapper[4948]: I1204 17:56:32.925472 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" path="/var/lib/kubelet/pods/6e2de27d-b6aa-42a4-a11e-c9241d8b619d/volumes" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.249989 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2v5d6" event={"ID":"ad839bcb-16b3-4321-8cf7-5e698ea7b32d","Type":"ContainerDied","Data":"334d8028f5b0024a2f97481cc67c11640649f0116aff4cd6589854a906501e9e"} Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.250066 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334d8028f5b0024a2f97481cc67c11640649f0116aff4cd6589854a906501e9e" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.250090 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2v5d6" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.539960 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-jp7qw"] Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540339 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63125130-8f44-4d42-8fa9-2631c2c3d8ec" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540361 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="63125130-8f44-4d42-8fa9-2631c2c3d8ec" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540375 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad839bcb-16b3-4321-8cf7-5e698ea7b32d" containerName="keystone-db-sync" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540383 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad839bcb-16b3-4321-8cf7-5e698ea7b32d" containerName="keystone-db-sync" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540397 4948 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d49c2c-6474-4667-ba8c-21c2a24e4522" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540405 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d49c2c-6474-4667-ba8c-21c2a24e4522" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540424 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505f7a05-8fe4-4e76-b5ed-45339ebda3dc" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540430 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="505f7a05-8fe4-4e76-b5ed-45339ebda3dc" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540447 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d77e21-3036-4810-98ec-1a44a3f882df" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540454 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d77e21-3036-4810-98ec-1a44a3f882df" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540464 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerName="dnsmasq-dns" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540470 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerName="dnsmasq-dns" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540481 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa21dba-653c-4cec-9ecd-09a6e1dfa082" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540488 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa21dba-653c-4cec-9ecd-09a6e1dfa082" containerName="mariadb-database-create" 
Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540502 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145f54f7-b50a-4d77-8152-5d8986faa646" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540509 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="145f54f7-b50a-4d77-8152-5d8986faa646" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: E1204 17:56:33.540520 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerName="init" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540526 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerName="init" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540728 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="145f54f7-b50a-4d77-8152-5d8986faa646" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540741 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d49c2c-6474-4667-ba8c-21c2a24e4522" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540755 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="505f7a05-8fe4-4e76-b5ed-45339ebda3dc" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540765 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="63125130-8f44-4d42-8fa9-2631c2c3d8ec" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540779 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa21dba-653c-4cec-9ecd-09a6e1dfa082" containerName="mariadb-database-create" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540791 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad839bcb-16b3-4321-8cf7-5e698ea7b32d" containerName="keystone-db-sync" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540802 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d77e21-3036-4810-98ec-1a44a3f882df" containerName="mariadb-account-create-update" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.540811 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2de27d-b6aa-42a4-a11e-c9241d8b619d" containerName="dnsmasq-dns" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.541854 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.557172 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hb2jf"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.558927 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.562495 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.562573 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.562619 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.562747 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ltmz5" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.562820 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.578770 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hb2jf"] Dec 04 
17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.594389 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-jp7qw"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614615 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-config\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614671 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwgz\" (UniqueName: \"kubernetes.io/projected/89732403-2037-41a9-84fd-5419342a46c2-kube-api-access-rdwgz\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614719 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-combined-ca-bundle\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614745 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-svc\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614843 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614865 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.614887 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-credential-keys\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.615145 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-config-data\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.615203 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.615308 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-scripts\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.615413 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-fernet-keys\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.615490 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vv5\" (UniqueName: \"kubernetes.io/projected/04f912df-523c-47c0-a306-d149ae78b924-kube-api-access-s5vv5\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.716957 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-config-data\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717009 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717129 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-scripts\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717154 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-fernet-keys\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717178 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vv5\" (UniqueName: \"kubernetes.io/projected/04f912df-523c-47c0-a306-d149ae78b924-kube-api-access-s5vv5\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717210 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-config\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717233 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwgz\" (UniqueName: \"kubernetes.io/projected/89732403-2037-41a9-84fd-5419342a46c2-kube-api-access-rdwgz\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717268 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-combined-ca-bundle\") pod 
\"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717286 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-svc\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717312 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717328 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.717347 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-credential-keys\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.718080 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " 
pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.718487 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-svc\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.718661 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.723003 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-config\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.724614 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-credential-keys\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.725221 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.725958 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: 
\"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.727108 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.728130 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-combined-ca-bundle\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.731319 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.731529 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.734672 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-fernet-keys\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.737372 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-config-data\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.741464 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-scripts\") pod \"keystone-bootstrap-hb2jf\" (UID: 
\"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.759341 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwgz\" (UniqueName: \"kubernetes.io/projected/89732403-2037-41a9-84fd-5419342a46c2-kube-api-access-rdwgz\") pod \"keystone-bootstrap-hb2jf\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.770684 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vv5\" (UniqueName: \"kubernetes.io/projected/04f912df-523c-47c0-a306-d149ae78b924-kube-api-access-s5vv5\") pod \"dnsmasq-dns-5b868669f-jp7qw\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.818327 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-run-httpd\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.818859 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-config-data\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.818964 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zb8\" (UniqueName: \"kubernetes.io/projected/d845ad24-e30a-41e2-8a0b-6812b49b91d1-kube-api-access-z4zb8\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 
17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.819127 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.819243 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.819335 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-log-httpd\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.819480 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-scripts\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.828101 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nccrm"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.829163 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.833451 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2q66q" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.833624 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.833722 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.852033 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.861533 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.865371 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nccrm"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.883141 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.893394 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hpqvt"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.894663 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.904437 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.904643 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.904832 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rzl7p" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.952871 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hpqvt"] Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.956189 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-run-httpd\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.974572 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-config-data\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.975943 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zb8\" (UniqueName: \"kubernetes.io/projected/d845ad24-e30a-41e2-8a0b-6812b49b91d1-kube-api-access-z4zb8\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.976156 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.976431 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.976660 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-log-httpd\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.977017 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-scripts\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.956866 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-run-httpd\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.985921 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-log-httpd\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.989944 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-scripts\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.999117 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:33 crc kubenswrapper[4948]: I1204 17:56:33.999255 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-jp7qw"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.003170 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.006132 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-config-data\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.014510 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zb8\" (UniqueName: \"kubernetes.io/projected/d845ad24-e30a-41e2-8a0b-6812b49b91d1-kube-api-access-z4zb8\") pod \"ceilometer-0\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " pod="openstack/ceilometer-0" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.032145 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-xcq2k"] Dec 
04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.033637 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.053380 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7zdlz"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.054670 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.059030 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xxfqv"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.059864 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.061282 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.061542 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rp6fj" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.071159 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.073509 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9vqcg" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.073770 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.078796 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8913b68d-4b7f-4a2e-b097-a60b0f557827-logs\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.078849 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl2m\" (UniqueName: \"kubernetes.io/projected/2b81424a-68f9-40e6-bd32-a932a675578a-kube-api-access-kkl2m\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.078920 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.078953 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-config-data\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 
17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.078994 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079024 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-config\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079075 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-config\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079099 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xphmb\" (UniqueName: \"kubernetes.io/projected/8913b68d-4b7f-4a2e-b097-a60b0f557827-kube-api-access-xphmb\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079121 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcs8k\" (UniqueName: \"kubernetes.io/projected/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-kube-api-access-dcs8k\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 
crc kubenswrapper[4948]: I1204 17:56:34.079150 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b81424a-68f9-40e6-bd32-a932a675578a-etc-machine-id\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079191 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-combined-ca-bundle\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079214 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77k9v\" (UniqueName: \"kubernetes.io/projected/dfe507e1-c353-41eb-b746-44a5f4c78539-kube-api-access-77k9v\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079231 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-combined-ca-bundle\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079268 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-db-sync-config-data\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 
17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079296 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-scripts\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079312 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079332 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-combined-ca-bundle\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079347 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-scripts\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.079368 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-svc\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 
17:56:34.079423 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-config-data\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.122520 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7zdlz"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.154642 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-xcq2k"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.174898 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xxfqv"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182005 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-config-data\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182127 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8913b68d-4b7f-4a2e-b097-a60b0f557827-logs\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182219 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl2m\" (UniqueName: \"kubernetes.io/projected/2b81424a-68f9-40e6-bd32-a932a675578a-kube-api-access-kkl2m\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 
17:56:34.182303 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-combined-ca-bundle\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182382 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182445 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-config-data\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182519 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182587 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-config\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182658 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-config\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182716 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xphmb\" (UniqueName: \"kubernetes.io/projected/8913b68d-4b7f-4a2e-b097-a60b0f557827-kube-api-access-xphmb\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182779 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcs8k\" (UniqueName: \"kubernetes.io/projected/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-kube-api-access-dcs8k\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182842 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b81424a-68f9-40e6-bd32-a932a675578a-etc-machine-id\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182921 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-combined-ca-bundle\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.182992 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzwj\" (UniqueName: 
\"kubernetes.io/projected/1bd09899-e64d-4b12-b604-dcd87d9c868b-kube-api-access-5fzwj\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183092 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77k9v\" (UniqueName: \"kubernetes.io/projected/dfe507e1-c353-41eb-b746-44a5f4c78539-kube-api-access-77k9v\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183169 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-combined-ca-bundle\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183235 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-db-sync-config-data\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183309 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-db-sync-config-data\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183378 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-scripts\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183442 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183505 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-combined-ca-bundle\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183563 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-scripts\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.183629 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-svc\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.184479 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-svc\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: 
\"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.191029 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.191773 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.192075 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b81424a-68f9-40e6-bd32-a932a675578a-etc-machine-id\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.198310 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8913b68d-4b7f-4a2e-b097-a60b0f557827-logs\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.201329 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-config\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.202163 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-sb\") pod 
\"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.202649 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.235371 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-db-sync-config-data\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.235516 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77k9v\" (UniqueName: \"kubernetes.io/projected/dfe507e1-c353-41eb-b746-44a5f4c78539-kube-api-access-77k9v\") pod \"dnsmasq-dns-cf78879c9-xcq2k\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.240717 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcs8k\" (UniqueName: \"kubernetes.io/projected/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-kube-api-access-dcs8k\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.242748 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-config-data\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " 
pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.243083 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-config\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.249816 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-scripts\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.256397 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl2m\" (UniqueName: \"kubernetes.io/projected/2b81424a-68f9-40e6-bd32-a932a675578a-kube-api-access-kkl2m\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.256655 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xphmb\" (UniqueName: \"kubernetes.io/projected/8913b68d-4b7f-4a2e-b097-a60b0f557827-kube-api-access-xphmb\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.259905 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-combined-ca-bundle\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.261596 4948 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-scripts\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.268033 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-config-data\") pod \"placement-db-sync-7zdlz\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.268603 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-combined-ca-bundle\") pod \"neutron-db-sync-hpqvt\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.271060 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-combined-ca-bundle\") pod \"cinder-db-sync-nccrm\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.284706 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-combined-ca-bundle\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.284814 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzwj\" (UniqueName: \"kubernetes.io/projected/1bd09899-e64d-4b12-b604-dcd87d9c868b-kube-api-access-5fzwj\") pod 
\"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.284869 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-db-sync-config-data\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.316963 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-db-sync-config-data\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.341745 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzwj\" (UniqueName: \"kubernetes.io/projected/1bd09899-e64d-4b12-b604-dcd87d9c868b-kube-api-access-5fzwj\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.342960 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-combined-ca-bundle\") pod \"barbican-db-sync-xxfqv\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.388585 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.409308 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7zdlz" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.498067 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.544731 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nccrm" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.560504 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.659684 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-jp7qw"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.787182 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hb2jf"] Dec 04 17:56:34 crc kubenswrapper[4948]: I1204 17:56:34.923925 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:56:35 crc kubenswrapper[4948]: W1204 17:56:35.138886 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd845ad24_e30a_41e2_8a0b_6812b49b91d1.slice/crio-2d5fbce719e33e8cfe366c025168b94386f18db8b8fcc08f9fdf893fee26302f WatchSource:0}: Error finding container 2d5fbce719e33e8cfe366c025168b94386f18db8b8fcc08f9fdf893fee26302f: Status 404 returned error can't find the container with id 2d5fbce719e33e8cfe366c025168b94386f18db8b8fcc08f9fdf893fee26302f Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.337455 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" event={"ID":"04f912df-523c-47c0-a306-d149ae78b924","Type":"ContainerStarted","Data":"badc4729f7c7f60bbe73eee0120165030cea3adebb7750f874dc8be492d17631"} Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.340923 4948 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hb2jf" event={"ID":"89732403-2037-41a9-84fd-5419342a46c2","Type":"ContainerStarted","Data":"c62f3fda366c3cccc4c16c4d7ae57cc87dc70ab232412fb73ff4484e38e8b99d"} Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.341833 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerStarted","Data":"2d5fbce719e33e8cfe366c025168b94386f18db8b8fcc08f9fdf893fee26302f"} Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.779164 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-xcq2k"] Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.861882 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7zdlz"] Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.897556 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xxfqv"] Dec 04 17:56:35 crc kubenswrapper[4948]: W1204 17:56:35.899142 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8913b68d_4b7f_4a2e_b097_a60b0f557827.slice/crio-4e72858689350cb49327f8d3571ad203020ea4e4c59d5648551e8f8acdbcd85f WatchSource:0}: Error finding container 4e72858689350cb49327f8d3571ad203020ea4e4c59d5648551e8f8acdbcd85f: Status 404 returned error can't find the container with id 4e72858689350cb49327f8d3571ad203020ea4e4c59d5648551e8f8acdbcd85f Dec 04 17:56:35 crc kubenswrapper[4948]: W1204 17:56:35.903362 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bd09899_e64d_4b12_b604_dcd87d9c868b.slice/crio-3ae9e493d43b87480463e8f38ed6b12425463e49f0b4c6f6d7240ef507d98a02 WatchSource:0}: Error finding container 3ae9e493d43b87480463e8f38ed6b12425463e49f0b4c6f6d7240ef507d98a02: Status 404 returned 
error can't find the container with id 3ae9e493d43b87480463e8f38ed6b12425463e49f0b4c6f6d7240ef507d98a02 Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.908015 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hpqvt"] Dec 04 17:56:35 crc kubenswrapper[4948]: I1204 17:56:35.963074 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.014376 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nccrm"] Dec 04 17:56:36 crc kubenswrapper[4948]: W1204 17:56:36.017329 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b81424a_68f9_40e6_bd32_a932a675578a.slice/crio-2e57988cfcced286f7ab9347d4eb01560da850e84a77ada315a49b4e465b093b WatchSource:0}: Error finding container 2e57988cfcced286f7ab9347d4eb01560da850e84a77ada315a49b4e465b093b: Status 404 returned error can't find the container with id 2e57988cfcced286f7ab9347d4eb01560da850e84a77ada315a49b4e465b093b Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.352403 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hb2jf" event={"ID":"89732403-2037-41a9-84fd-5419342a46c2","Type":"ContainerStarted","Data":"c8b321349d14a6fd9bd0edb8796b3c0b16d8a5faecb0046b5fad47aeb986e444"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.353723 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7zdlz" event={"ID":"8913b68d-4b7f-4a2e-b097-a60b0f557827","Type":"ContainerStarted","Data":"4e72858689350cb49327f8d3571ad203020ea4e4c59d5648551e8f8acdbcd85f"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.355173 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hpqvt" 
event={"ID":"7c2276fc-7b3c-4113-abb2-4e2558c9dc03","Type":"ContainerStarted","Data":"d9ee9b5e032d4779ab04e1d2c193283f985ed2c160939ade90340fbe03abc118"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.355307 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hpqvt" event={"ID":"7c2276fc-7b3c-4113-abb2-4e2558c9dc03","Type":"ContainerStarted","Data":"56bff914b54cfa17e57a986ae3d1acb0a4f9d6f9fa80d4eab95018d83f25ff7f"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.355999 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxfqv" event={"ID":"1bd09899-e64d-4b12-b604-dcd87d9c868b","Type":"ContainerStarted","Data":"3ae9e493d43b87480463e8f38ed6b12425463e49f0b4c6f6d7240ef507d98a02"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.357543 4948 generic.go:334] "Generic (PLEG): container finished" podID="04f912df-523c-47c0-a306-d149ae78b924" containerID="e17d41a2f556ea6e90e8fc7fcb453514f9512081ff3517f5f190916639fd9f56" exitCode=0 Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.357609 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" event={"ID":"04f912df-523c-47c0-a306-d149ae78b924","Type":"ContainerDied","Data":"e17d41a2f556ea6e90e8fc7fcb453514f9512081ff3517f5f190916639fd9f56"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.358578 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nccrm" event={"ID":"2b81424a-68f9-40e6-bd32-a932a675578a","Type":"ContainerStarted","Data":"2e57988cfcced286f7ab9347d4eb01560da850e84a77ada315a49b4e465b093b"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.359589 4948 generic.go:334] "Generic (PLEG): container finished" podID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerID="77f8f388831571f420ea502428e1cccc7748d32cc2a97a4fa9a3499e667409d4" exitCode=0 Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.359614 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" event={"ID":"dfe507e1-c353-41eb-b746-44a5f4c78539","Type":"ContainerDied","Data":"77f8f388831571f420ea502428e1cccc7748d32cc2a97a4fa9a3499e667409d4"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.359630 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" event={"ID":"dfe507e1-c353-41eb-b746-44a5f4c78539","Type":"ContainerStarted","Data":"18d60f46c29ac4bf97080a0be4c3c3d3d389761e3e28e4e35e3b92e2cd922149"} Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.367305 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hb2jf" podStartSLOduration=3.3672808229999998 podStartE2EDuration="3.367280823s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:56:36.366351476 +0000 UTC m=+1807.727425878" watchObservedRunningTime="2025-12-04 17:56:36.367280823 +0000 UTC m=+1807.728355225" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.390683 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hpqvt" podStartSLOduration=3.390667572 podStartE2EDuration="3.390667572s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:56:36.378982188 +0000 UTC m=+1807.740056590" watchObservedRunningTime="2025-12-04 17:56:36.390667572 +0000 UTC m=+1807.751741974" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.739645 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.859441 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-swift-storage-0\") pod \"04f912df-523c-47c0-a306-d149ae78b924\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.859563 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-sb\") pod \"04f912df-523c-47c0-a306-d149ae78b924\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.859641 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vv5\" (UniqueName: \"kubernetes.io/projected/04f912df-523c-47c0-a306-d149ae78b924-kube-api-access-s5vv5\") pod \"04f912df-523c-47c0-a306-d149ae78b924\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.859694 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-config\") pod \"04f912df-523c-47c0-a306-d149ae78b924\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.859742 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-svc\") pod \"04f912df-523c-47c0-a306-d149ae78b924\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.859768 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-nb\") pod \"04f912df-523c-47c0-a306-d149ae78b924\" (UID: \"04f912df-523c-47c0-a306-d149ae78b924\") " Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.865214 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f912df-523c-47c0-a306-d149ae78b924-kube-api-access-s5vv5" (OuterVolumeSpecName: "kube-api-access-s5vv5") pod "04f912df-523c-47c0-a306-d149ae78b924" (UID: "04f912df-523c-47c0-a306-d149ae78b924"). InnerVolumeSpecName "kube-api-access-s5vv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.883470 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04f912df-523c-47c0-a306-d149ae78b924" (UID: "04f912df-523c-47c0-a306-d149ae78b924"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.890803 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04f912df-523c-47c0-a306-d149ae78b924" (UID: "04f912df-523c-47c0-a306-d149ae78b924"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.891118 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-config" (OuterVolumeSpecName: "config") pod "04f912df-523c-47c0-a306-d149ae78b924" (UID: "04f912df-523c-47c0-a306-d149ae78b924"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.893211 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04f912df-523c-47c0-a306-d149ae78b924" (UID: "04f912df-523c-47c0-a306-d149ae78b924"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.893211 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04f912df-523c-47c0-a306-d149ae78b924" (UID: "04f912df-523c-47c0-a306-d149ae78b924"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.961721 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5vv5\" (UniqueName: \"kubernetes.io/projected/04f912df-523c-47c0-a306-d149ae78b924-kube-api-access-s5vv5\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.961747 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.961757 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.961767 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:36 crc 
kubenswrapper[4948]: I1204 17:56:36.961775 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:36 crc kubenswrapper[4948]: I1204 17:56:36.961783 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f912df-523c-47c0-a306-d149ae78b924-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.377380 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" event={"ID":"04f912df-523c-47c0-a306-d149ae78b924","Type":"ContainerDied","Data":"badc4729f7c7f60bbe73eee0120165030cea3adebb7750f874dc8be492d17631"} Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.377627 4948 scope.go:117] "RemoveContainer" containerID="e17d41a2f556ea6e90e8fc7fcb453514f9512081ff3517f5f190916639fd9f56" Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.377728 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-jp7qw" Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.382577 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" event={"ID":"dfe507e1-c353-41eb-b746-44a5f4c78539","Type":"ContainerStarted","Data":"a763255c618b5532547e758c5f138f95331a46951817c6ef13b9bd81d5e4ed26"} Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.383462 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.384723 4948 generic.go:334] "Generic (PLEG): container finished" podID="55887774-d332-4083-8f3c-6281330114cd" containerID="09a87ed238333efb556cbdeb3f665192194c7e862fa0cb98f5d2e669778e36e4" exitCode=0 Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.385207 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jn67c" event={"ID":"55887774-d332-4083-8f3c-6281330114cd","Type":"ContainerDied","Data":"09a87ed238333efb556cbdeb3f665192194c7e862fa0cb98f5d2e669778e36e4"} Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.417476 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-jp7qw"] Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.424437 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-jp7qw"] Dec 04 17:56:37 crc kubenswrapper[4948]: I1204 17:56:37.447002 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" podStartSLOduration=4.446987027 podStartE2EDuration="4.446987027s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:56:37.442396882 +0000 UTC m=+1808.803471284" watchObservedRunningTime="2025-12-04 17:56:37.446987027 +0000 UTC 
m=+1808.808061429" Dec 04 17:56:38 crc kubenswrapper[4948]: I1204 17:56:38.938795 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f912df-523c-47c0-a306-d149ae78b924" path="/var/lib/kubelet/pods/04f912df-523c-47c0-a306-d149ae78b924/volumes" Dec 04 17:56:40 crc kubenswrapper[4948]: I1204 17:56:40.423454 4948 generic.go:334] "Generic (PLEG): container finished" podID="89732403-2037-41a9-84fd-5419342a46c2" containerID="c8b321349d14a6fd9bd0edb8796b3c0b16d8a5faecb0046b5fad47aeb986e444" exitCode=0 Dec 04 17:56:40 crc kubenswrapper[4948]: I1204 17:56:40.423679 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hb2jf" event={"ID":"89732403-2037-41a9-84fd-5419342a46c2","Type":"ContainerDied","Data":"c8b321349d14a6fd9bd0edb8796b3c0b16d8a5faecb0046b5fad47aeb986e444"} Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.185142 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jn67c" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.244714 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-config-data\") pod \"55887774-d332-4083-8f3c-6281330114cd\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.244768 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-db-sync-config-data\") pod \"55887774-d332-4083-8f3c-6281330114cd\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.250396 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod 
"55887774-d332-4083-8f3c-6281330114cd" (UID: "55887774-d332-4083-8f3c-6281330114cd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.294970 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-config-data" (OuterVolumeSpecName: "config-data") pod "55887774-d332-4083-8f3c-6281330114cd" (UID: "55887774-d332-4083-8f3c-6281330114cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.346604 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-combined-ca-bundle\") pod \"55887774-d332-4083-8f3c-6281330114cd\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.346742 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl8fs\" (UniqueName: \"kubernetes.io/projected/55887774-d332-4083-8f3c-6281330114cd-kube-api-access-rl8fs\") pod \"55887774-d332-4083-8f3c-6281330114cd\" (UID: \"55887774-d332-4083-8f3c-6281330114cd\") " Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.347204 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.347221 4948 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.349452 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/55887774-d332-4083-8f3c-6281330114cd-kube-api-access-rl8fs" (OuterVolumeSpecName: "kube-api-access-rl8fs") pod "55887774-d332-4083-8f3c-6281330114cd" (UID: "55887774-d332-4083-8f3c-6281330114cd"). InnerVolumeSpecName "kube-api-access-rl8fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.372249 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55887774-d332-4083-8f3c-6281330114cd" (UID: "55887774-d332-4083-8f3c-6281330114cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.435157 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jn67c" event={"ID":"55887774-d332-4083-8f3c-6281330114cd","Type":"ContainerDied","Data":"6eac8ea76d56c376c91b9c35ad20489e0fafdbd54a3cf57328312e571a643ba0"} Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.435192 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jn67c" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.435212 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eac8ea76d56c376c91b9c35ad20489e0fafdbd54a3cf57328312e571a643ba0" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.448545 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55887774-d332-4083-8f3c-6281330114cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:41 crc kubenswrapper[4948]: I1204 17:56:41.448577 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl8fs\" (UniqueName: \"kubernetes.io/projected/55887774-d332-4083-8f3c-6281330114cd-kube-api-access-rl8fs\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.704413 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-xcq2k"] Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.705020 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" containerID="cri-o://a763255c618b5532547e758c5f138f95331a46951817c6ef13b9bd81d5e4ed26" gracePeriod=10 Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.708213 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.768129 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xxthl"] Dec 04 17:56:42 crc kubenswrapper[4948]: E1204 17:56:42.768506 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f912df-523c-47c0-a306-d149ae78b924" containerName="init" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.768520 4948 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="04f912df-523c-47c0-a306-d149ae78b924" containerName="init" Dec 04 17:56:42 crc kubenswrapper[4948]: E1204 17:56:42.768539 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55887774-d332-4083-8f3c-6281330114cd" containerName="glance-db-sync" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.768546 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="55887774-d332-4083-8f3c-6281330114cd" containerName="glance-db-sync" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.768722 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="55887774-d332-4083-8f3c-6281330114cd" containerName="glance-db-sync" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.768748 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f912df-523c-47c0-a306-d149ae78b924" containerName="init" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.769627 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.776188 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.776259 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.776298 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.776315 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-config\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.776352 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f49pb\" (UniqueName: \"kubernetes.io/projected/353c6df2-5698-49e5-969e-fac665a5e6e6-kube-api-access-f49pb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.776385 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.829497 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xxthl"] Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.878029 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: 
\"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.878091 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-config\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.878176 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f49pb\" (UniqueName: \"kubernetes.io/projected/353c6df2-5698-49e5-969e-fac665a5e6e6-kube-api-access-f49pb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.878237 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.878410 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.878464 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.880221 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.880745 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.882715 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-config\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.883574 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.884059 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:42 crc kubenswrapper[4948]: I1204 17:56:42.908255 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f49pb\" (UniqueName: \"kubernetes.io/projected/353c6df2-5698-49e5-969e-fac665a5e6e6-kube-api-access-f49pb\") pod \"dnsmasq-dns-56df8fb6b7-xxthl\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.101547 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.456285 4948 generic.go:334] "Generic (PLEG): container finished" podID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerID="a763255c618b5532547e758c5f138f95331a46951817c6ef13b9bd81d5e4ed26" exitCode=0 Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.456338 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" event={"ID":"dfe507e1-c353-41eb-b746-44a5f4c78539","Type":"ContainerDied","Data":"a763255c618b5532547e758c5f138f95331a46951817c6ef13b9bd81d5e4ed26"} Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.581931 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.583552 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.585964 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.586141 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.586993 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bdk4p" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.595993 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.694753 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.694842 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-logs\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.694900 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 
17:56:43.694936 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.694965 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.694997 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.695020 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xtdh\" (UniqueName: \"kubernetes.io/projected/6ba827a1-34af-4a65-8b05-18d80df57324-kube-api-access-9xtdh\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797152 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-logs\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 
17:56:43.797235 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797285 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797313 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797342 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797366 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xtdh\" (UniqueName: \"kubernetes.io/projected/6ba827a1-34af-4a65-8b05-18d80df57324-kube-api-access-9xtdh\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797456 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.797842 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.798340 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-logs\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.798543 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.821621 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.835639 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.842487 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xtdh\" (UniqueName: \"kubernetes.io/projected/6ba827a1-34af-4a65-8b05-18d80df57324-kube-api-access-9xtdh\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.846575 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.871230 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " pod="openstack/glance-default-external-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.876234 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.881837 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.884681 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.892815 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:56:43 crc kubenswrapper[4948]: I1204 17:56:43.905205 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.009577 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.009661 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.009701 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.011791 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cp59\" 
(UniqueName: \"kubernetes.io/projected/a563ce20-3f9a-4fed-972b-bf00209315b2-kube-api-access-6cp59\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.011817 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.011877 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.011928 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113383 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113507 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113598 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113629 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113694 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cp59\" (UniqueName: \"kubernetes.io/projected/a563ce20-3f9a-4fed-972b-bf00209315b2-kube-api-access-6cp59\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113721 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.113758 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-scripts\") 
pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.114374 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-logs\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.114822 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.115256 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.122944 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.124081 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.124239 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.133400 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cp59\" (UniqueName: \"kubernetes.io/projected/a563ce20-3f9a-4fed-972b-bf00209315b2-kube-api-access-6cp59\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.150554 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.252363 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.391764 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Dec 04 17:56:44 crc kubenswrapper[4948]: I1204 17:56:44.950975 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:56:45 crc kubenswrapper[4948]: I1204 17:56:45.010538 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:56:46 crc kubenswrapper[4948]: I1204 17:56:46.914686 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:56:46 crc kubenswrapper[4948]: E1204 17:56:46.915187 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:56:48 crc kubenswrapper[4948]: E1204 17:56:48.796987 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Dec 04 17:56:48 crc kubenswrapper[4948]: E1204 17:56:48.797536 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xphmb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-7zdlz_openstack(8913b68d-4b7f-4a2e-b097-a60b0f557827): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:56:48 crc kubenswrapper[4948]: E1204 17:56:48.799073 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-7zdlz" podUID="8913b68d-4b7f-4a2e-b097-a60b0f557827" Dec 04 17:56:48 crc kubenswrapper[4948]: I1204 17:56:48.927798 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.005984 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-combined-ca-bundle\") pod \"89732403-2037-41a9-84fd-5419342a46c2\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.006069 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-credential-keys\") pod \"89732403-2037-41a9-84fd-5419342a46c2\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.006141 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-scripts\") pod \"89732403-2037-41a9-84fd-5419342a46c2\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.006215 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdwgz\" (UniqueName: 
\"kubernetes.io/projected/89732403-2037-41a9-84fd-5419342a46c2-kube-api-access-rdwgz\") pod \"89732403-2037-41a9-84fd-5419342a46c2\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.006312 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-fernet-keys\") pod \"89732403-2037-41a9-84fd-5419342a46c2\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.006377 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-config-data\") pod \"89732403-2037-41a9-84fd-5419342a46c2\" (UID: \"89732403-2037-41a9-84fd-5419342a46c2\") " Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.014396 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "89732403-2037-41a9-84fd-5419342a46c2" (UID: "89732403-2037-41a9-84fd-5419342a46c2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.014438 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-scripts" (OuterVolumeSpecName: "scripts") pod "89732403-2037-41a9-84fd-5419342a46c2" (UID: "89732403-2037-41a9-84fd-5419342a46c2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.023708 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89732403-2037-41a9-84fd-5419342a46c2-kube-api-access-rdwgz" (OuterVolumeSpecName: "kube-api-access-rdwgz") pod "89732403-2037-41a9-84fd-5419342a46c2" (UID: "89732403-2037-41a9-84fd-5419342a46c2"). InnerVolumeSpecName "kube-api-access-rdwgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.035998 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "89732403-2037-41a9-84fd-5419342a46c2" (UID: "89732403-2037-41a9-84fd-5419342a46c2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.037008 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-config-data" (OuterVolumeSpecName: "config-data") pod "89732403-2037-41a9-84fd-5419342a46c2" (UID: "89732403-2037-41a9-84fd-5419342a46c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.053962 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89732403-2037-41a9-84fd-5419342a46c2" (UID: "89732403-2037-41a9-84fd-5419342a46c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.112317 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdwgz\" (UniqueName: \"kubernetes.io/projected/89732403-2037-41a9-84fd-5419342a46c2-kube-api-access-rdwgz\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.112355 4948 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.112367 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.112379 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.112392 4948 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.112403 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89732403-2037-41a9-84fd-5419342a46c2-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.507944 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hb2jf" Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.508030 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hb2jf" event={"ID":"89732403-2037-41a9-84fd-5419342a46c2","Type":"ContainerDied","Data":"c62f3fda366c3cccc4c16c4d7ae57cc87dc70ab232412fb73ff4484e38e8b99d"} Dec 04 17:56:49 crc kubenswrapper[4948]: I1204 17:56:49.508097 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c62f3fda366c3cccc4c16c4d7ae57cc87dc70ab232412fb73ff4484e38e8b99d" Dec 04 17:56:49 crc kubenswrapper[4948]: E1204 17:56:49.509453 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-7zdlz" podUID="8913b68d-4b7f-4a2e-b097-a60b0f557827" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.007298 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hb2jf"] Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.015740 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hb2jf"] Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.112539 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rjcpf"] Dec 04 17:56:50 crc kubenswrapper[4948]: E1204 17:56:50.112975 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89732403-2037-41a9-84fd-5419342a46c2" containerName="keystone-bootstrap" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.112994 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="89732403-2037-41a9-84fd-5419342a46c2" containerName="keystone-bootstrap" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.113213 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="89732403-2037-41a9-84fd-5419342a46c2" containerName="keystone-bootstrap" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.113924 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.115875 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.116466 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.116623 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ltmz5" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.117565 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.118229 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.166636 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rjcpf"] Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.240452 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-credential-keys\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.240509 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-fernet-keys\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " 
pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.240532 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-combined-ca-bundle\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.240759 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-config-data\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.240864 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtff5\" (UniqueName: \"kubernetes.io/projected/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-kube-api-access-dtff5\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.240913 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-scripts\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.343128 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-credential-keys\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " 
pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.343177 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-fernet-keys\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.343202 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-combined-ca-bundle\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.343259 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-config-data\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.343294 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtff5\" (UniqueName: \"kubernetes.io/projected/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-kube-api-access-dtff5\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.343319 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-scripts\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.348557 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-scripts\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.350328 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-credential-keys\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.358500 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-config-data\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.360466 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-fernet-keys\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.361149 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-combined-ca-bundle\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.361609 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtff5\" (UniqueName: 
\"kubernetes.io/projected/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-kube-api-access-dtff5\") pod \"keystone-bootstrap-rjcpf\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.454248 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:56:50 crc kubenswrapper[4948]: I1204 17:56:50.926246 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89732403-2037-41a9-84fd-5419342a46c2" path="/var/lib/kubelet/pods/89732403-2037-41a9-84fd-5419342a46c2/volumes" Dec 04 17:56:54 crc kubenswrapper[4948]: I1204 17:56:54.391276 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout" Dec 04 17:56:56 crc kubenswrapper[4948]: I1204 17:56:56.954255 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.059564 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-config\") pod \"dfe507e1-c353-41eb-b746-44a5f4c78539\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.059607 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-swift-storage-0\") pod \"dfe507e1-c353-41eb-b746-44a5f4c78539\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.059688 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77k9v\" (UniqueName: \"kubernetes.io/projected/dfe507e1-c353-41eb-b746-44a5f4c78539-kube-api-access-77k9v\") pod \"dfe507e1-c353-41eb-b746-44a5f4c78539\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.059772 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-svc\") pod \"dfe507e1-c353-41eb-b746-44a5f4c78539\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.059831 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-nb\") pod \"dfe507e1-c353-41eb-b746-44a5f4c78539\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.059853 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-sb\") pod \"dfe507e1-c353-41eb-b746-44a5f4c78539\" (UID: \"dfe507e1-c353-41eb-b746-44a5f4c78539\") " Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.087374 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe507e1-c353-41eb-b746-44a5f4c78539-kube-api-access-77k9v" (OuterVolumeSpecName: "kube-api-access-77k9v") pod "dfe507e1-c353-41eb-b746-44a5f4c78539" (UID: "dfe507e1-c353-41eb-b746-44a5f4c78539"). InnerVolumeSpecName "kube-api-access-77k9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.096632 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfe507e1-c353-41eb-b746-44a5f4c78539" (UID: "dfe507e1-c353-41eb-b746-44a5f4c78539"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.109592 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfe507e1-c353-41eb-b746-44a5f4c78539" (UID: "dfe507e1-c353-41eb-b746-44a5f4c78539"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.110603 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfe507e1-c353-41eb-b746-44a5f4c78539" (UID: "dfe507e1-c353-41eb-b746-44a5f4c78539"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.122208 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-config" (OuterVolumeSpecName: "config") pod "dfe507e1-c353-41eb-b746-44a5f4c78539" (UID: "dfe507e1-c353-41eb-b746-44a5f4c78539"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.127428 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dfe507e1-c353-41eb-b746-44a5f4c78539" (UID: "dfe507e1-c353-41eb-b746-44a5f4c78539"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.162304 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.162508 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.162569 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.162630 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:57 crc 
kubenswrapper[4948]: I1204 17:56:57.162761 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfe507e1-c353-41eb-b746-44a5f4c78539-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.162820 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77k9v\" (UniqueName: \"kubernetes.io/projected/dfe507e1-c353-41eb-b746-44a5f4c78539-kube-api-access-77k9v\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.581843 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" event={"ID":"dfe507e1-c353-41eb-b746-44a5f4c78539","Type":"ContainerDied","Data":"18d60f46c29ac4bf97080a0be4c3c3d3d389761e3e28e4e35e3b92e2cd922149"} Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.581907 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.581913 4948 scope.go:117] "RemoveContainer" containerID="a763255c618b5532547e758c5f138f95331a46951817c6ef13b9bd81d5e4ed26" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.586198 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hpqvt" event={"ID":"7c2276fc-7b3c-4113-abb2-4e2558c9dc03","Type":"ContainerDied","Data":"d9ee9b5e032d4779ab04e1d2c193283f985ed2c160939ade90340fbe03abc118"} Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.586359 4948 generic.go:334] "Generic (PLEG): container finished" podID="7c2276fc-7b3c-4113-abb2-4e2558c9dc03" containerID="d9ee9b5e032d4779ab04e1d2c193283f985ed2c160939ade90340fbe03abc118" exitCode=0 Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.623978 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-xcq2k"] Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 
17:56:57.639394 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-xcq2k"] Dec 04 17:56:57 crc kubenswrapper[4948]: E1204 17:56:57.698736 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe507e1_c353_41eb_b746_44a5f4c78539.slice\": RecentStats: unable to find data in memory cache]" Dec 04 17:56:57 crc kubenswrapper[4948]: E1204 17:56:57.980278 4948 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 04 17:56:57 crc kubenswrapper[4948]: E1204 17:56:57.980779 4948 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kkl2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nccrm_openstack(2b81424a-68f9-40e6-bd32-a932a675578a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 17:56:57 crc kubenswrapper[4948]: E1204 17:56:57.982892 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nccrm" podUID="2b81424a-68f9-40e6-bd32-a932a675578a" Dec 04 17:56:57 crc kubenswrapper[4948]: I1204 17:56:57.991715 4948 scope.go:117] "RemoveContainer" containerID="77f8f388831571f420ea502428e1cccc7748d32cc2a97a4fa9a3499e667409d4" Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.488510 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xxthl"] Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.553815 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.607302 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rjcpf"] Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.609487 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxfqv" event={"ID":"1bd09899-e64d-4b12-b604-dcd87d9c868b","Type":"ContainerStarted","Data":"ff7daf90f4531c9b225c2a333f522204056bff71a75157eece31bc57ae7f99af"} Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.612699 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a563ce20-3f9a-4fed-972b-bf00209315b2","Type":"ContainerStarted","Data":"721a9e2a3a3fe0720872365680ab265e665f3e538e95c40a95e1cdaec1925512"} Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.613707 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" event={"ID":"353c6df2-5698-49e5-969e-fac665a5e6e6","Type":"ContainerStarted","Data":"e902180dda923a473c328c1c621898ba4186a9366722ff204e48dcf65373de9c"} Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.617716 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerStarted","Data":"5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db"} Dec 04 17:56:58 crc kubenswrapper[4948]: E1204 17:56:58.619390 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-nccrm" podUID="2b81424a-68f9-40e6-bd32-a932a675578a" Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.625776 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xxfqv" podStartSLOduration=3.58321241 podStartE2EDuration="25.625763293s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="2025-12-04 17:56:35.911794765 +0000 UTC m=+1807.272869167" lastFinishedPulling="2025-12-04 17:56:57.954345648 +0000 UTC m=+1829.315420050" observedRunningTime="2025-12-04 17:56:58.625370811 +0000 UTC m=+1829.986445213" watchObservedRunningTime="2025-12-04 17:56:58.625763293 +0000 UTC m=+1829.986837695" Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.675187 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:56:58 crc kubenswrapper[4948]: W1204 
17:56:58.690155 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba827a1_34af_4a65_8b05_18d80df57324.slice/crio-d1c6b4551a70181d4e6378086d8cdbb40dc648a5a4cf07f5594c7b43309927dc WatchSource:0}: Error finding container d1c6b4551a70181d4e6378086d8cdbb40dc648a5a4cf07f5594c7b43309927dc: Status 404 returned error can't find the container with id d1c6b4551a70181d4e6378086d8cdbb40dc648a5a4cf07f5594c7b43309927dc Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.906527 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.942865 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" path="/var/lib/kubelet/pods/dfe507e1-c353-41eb-b746-44a5f4c78539/volumes" Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.991307 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-combined-ca-bundle\") pod \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.991442 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcs8k\" (UniqueName: \"kubernetes.io/projected/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-kube-api-access-dcs8k\") pod \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " Dec 04 17:56:58 crc kubenswrapper[4948]: I1204 17:56:58.991472 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-config\") pod \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\" (UID: \"7c2276fc-7b3c-4113-abb2-4e2558c9dc03\") " Dec 04 
17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.007015 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-kube-api-access-dcs8k" (OuterVolumeSpecName: "kube-api-access-dcs8k") pod "7c2276fc-7b3c-4113-abb2-4e2558c9dc03" (UID: "7c2276fc-7b3c-4113-abb2-4e2558c9dc03"). InnerVolumeSpecName "kube-api-access-dcs8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.026800 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-config" (OuterVolumeSpecName: "config") pod "7c2276fc-7b3c-4113-abb2-4e2558c9dc03" (UID: "7c2276fc-7b3c-4113-abb2-4e2558c9dc03"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.028858 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c2276fc-7b3c-4113-abb2-4e2558c9dc03" (UID: "7c2276fc-7b3c-4113-abb2-4e2558c9dc03"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.093578 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcs8k\" (UniqueName: \"kubernetes.io/projected/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-kube-api-access-dcs8k\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.093607 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.093618 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c2276fc-7b3c-4113-abb2-4e2558c9dc03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.394250 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-xcq2k" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.656033 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hpqvt" event={"ID":"7c2276fc-7b3c-4113-abb2-4e2558c9dc03","Type":"ContainerDied","Data":"56bff914b54cfa17e57a986ae3d1acb0a4f9d6f9fa80d4eab95018d83f25ff7f"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.656109 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56bff914b54cfa17e57a986ae3d1acb0a4f9d6f9fa80d4eab95018d83f25ff7f" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.656083 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hpqvt" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.657621 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjcpf" event={"ID":"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a","Type":"ContainerStarted","Data":"f64cbf7b6e43ed96dc6f100a33115d3b1f5b0f8fe82df36ecfd2f69eebd3aea8"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.657739 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjcpf" event={"ID":"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a","Type":"ContainerStarted","Data":"73e4851cd1c68069fef829f85b47587493def1fff470392cf31d1ce17eae120a"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.659210 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a563ce20-3f9a-4fed-972b-bf00209315b2","Type":"ContainerStarted","Data":"1086202a8a9df8e39256b991874791e58186abcfa2a20f4082f0d19c8e76202a"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.666423 4948 generic.go:334] "Generic (PLEG): container finished" podID="353c6df2-5698-49e5-969e-fac665a5e6e6" containerID="de8a6cfd06eb0b3173ebbf580760c2fec9712c5e5fc56756ae85756a77eb34fb" exitCode=0 Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.666520 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" event={"ID":"353c6df2-5698-49e5-969e-fac665a5e6e6","Type":"ContainerDied","Data":"de8a6cfd06eb0b3173ebbf580760c2fec9712c5e5fc56756ae85756a77eb34fb"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.686018 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rjcpf" podStartSLOduration=9.685997172 podStartE2EDuration="9.685997172s" podCreationTimestamp="2025-12-04 17:56:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 17:56:59.674487734 +0000 UTC m=+1831.035562136" watchObservedRunningTime="2025-12-04 17:56:59.685997172 +0000 UTC m=+1831.047071574" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.686148 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ba827a1-34af-4a65-8b05-18d80df57324","Type":"ContainerStarted","Data":"0d9bd5f05df11b25ce7671a425cd507c88b9ea45659b48b77b88a070061bf1ba"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.686200 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ba827a1-34af-4a65-8b05-18d80df57324","Type":"ContainerStarted","Data":"d1c6b4551a70181d4e6378086d8cdbb40dc648a5a4cf07f5594c7b43309927dc"} Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.854389 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xxthl"] Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.884651 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-n5kht"] Dec 04 17:56:59 crc kubenswrapper[4948]: E1204 17:56:59.885213 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="init" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.885230 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="init" Dec 04 17:56:59 crc kubenswrapper[4948]: E1204 17:56:59.885252 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.885259 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" Dec 04 17:56:59 crc kubenswrapper[4948]: E1204 17:56:59.885305 4948 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7c2276fc-7b3c-4113-abb2-4e2558c9dc03" containerName="neutron-db-sync" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.885315 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2276fc-7b3c-4113-abb2-4e2558c9dc03" containerName="neutron-db-sync" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.885583 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c2276fc-7b3c-4113-abb2-4e2558c9dc03" containerName="neutron-db-sync" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.885635 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe507e1-c353-41eb-b746-44a5f4c78539" containerName="dnsmasq-dns" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.887697 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:56:59 crc kubenswrapper[4948]: I1204 17:56:59.895075 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-n5kht"] Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.024774 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-svc\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.024822 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.024845 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-config\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.024859 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.024905 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzd8\" (UniqueName: \"kubernetes.io/projected/a7701ffc-1871-4c71-8048-23b425f47dec-kube-api-access-nnzd8\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.024952 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.069244 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-759ffd8674-fjwkq"] Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.070512 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.077125 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-759ffd8674-fjwkq"] Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.079841 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.080104 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rzl7p" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.080254 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.080403 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154026 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzd8\" (UniqueName: \"kubernetes.io/projected/a7701ffc-1871-4c71-8048-23b425f47dec-kube-api-access-nnzd8\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154111 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-httpd-config\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154136 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: 
\"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154185 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154217 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-config\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154246 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhwrk\" (UniqueName: \"kubernetes.io/projected/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-kube-api-access-vhwrk\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154271 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-svc\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154292 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") 
" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154312 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-config\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154327 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-combined-ca-bundle\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.154342 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.155376 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.155630 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 
04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.155900 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-svc\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.156369 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-config\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.156433 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.179894 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzd8\" (UniqueName: \"kubernetes.io/projected/a7701ffc-1871-4c71-8048-23b425f47dec-kube-api-access-nnzd8\") pod \"dnsmasq-dns-6b7b667979-n5kht\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: E1204 17:57:00.217984 4948 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 04 17:57:00 crc kubenswrapper[4948]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/353c6df2-5698-49e5-969e-fac665a5e6e6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 17:57:00 crc kubenswrapper[4948]: > 
podSandboxID="e902180dda923a473c328c1c621898ba4186a9366722ff204e48dcf65373de9c" Dec 04 17:57:00 crc kubenswrapper[4948]: E1204 17:57:00.218455 4948 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 04 17:57:00 crc kubenswrapper[4948]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dbh55fhfh596h68dh54dh55ch68ch5c7h5f8h68ch58fh644hb8h65bhc5h5d9h5bdhdh577h5dbh64dh694h687h667hcdh664h695h6fh579h95h565q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},VolumeMount{Name:kube-api-access-f49pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-56df8fb6b7-xxthl_openstack(353c6df2-5698-49e5-969e-fac665a5e6e6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/353c6df2-5698-49e5-969e-fac665a5e6e6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 17:57:00 crc kubenswrapper[4948]: > logger="UnhandledError" Dec 04 17:57:00 crc kubenswrapper[4948]: E1204 17:57:00.219694 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount 
`/var/lib/kubelet/pods/353c6df2-5698-49e5-969e-fac665a5e6e6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" podUID="353c6df2-5698-49e5-969e-fac665a5e6e6" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.257730 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-config\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.257821 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhwrk\" (UniqueName: \"kubernetes.io/projected/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-kube-api-access-vhwrk\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.257914 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-combined-ca-bundle\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.258100 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-httpd-config\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.258251 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.275732 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-combined-ca-bundle\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.324510 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-httpd-config\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.324803 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.325631 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-config\") pod \"neutron-759ffd8674-fjwkq\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.335493 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhwrk\" (UniqueName: \"kubernetes.io/projected/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-kube-api-access-vhwrk\") pod \"neutron-759ffd8674-fjwkq\" (UID: 
\"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.369324 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.376430 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.700810 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerStarted","Data":"a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af"} Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.711094 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a563ce20-3f9a-4fed-972b-bf00209315b2","Type":"ContainerStarted","Data":"64d3bd30618560134ad1780ad81211017de93b4ca3a2cbabc9ae78e1066bbd6b"} Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.711139 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-log" containerID="cri-o://1086202a8a9df8e39256b991874791e58186abcfa2a20f4082f0d19c8e76202a" gracePeriod=30 Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.711205 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-httpd" containerID="cri-o://64d3bd30618560134ad1780ad81211017de93b4ca3a2cbabc9ae78e1066bbd6b" gracePeriod=30 Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.746313 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=18.746296985 
podStartE2EDuration="18.746296985s" podCreationTimestamp="2025-12-04 17:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:00.741102782 +0000 UTC m=+1832.102177184" watchObservedRunningTime="2025-12-04 17:57:00.746296985 +0000 UTC m=+1832.107371407" Dec 04 17:57:00 crc kubenswrapper[4948]: I1204 17:57:00.962077 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-n5kht"] Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.184261 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-759ffd8674-fjwkq"] Dec 04 17:57:01 crc kubenswrapper[4948]: W1204 17:57:01.188411 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ea68ed_1808_416c_b0e8_a2682d3d3b1f.slice/crio-ac45923e9c7627031a4f5ad1d74f087ca83e61cb8fccd12c17c502a87d1b1fe4 WatchSource:0}: Error finding container ac45923e9c7627031a4f5ad1d74f087ca83e61cb8fccd12c17c502a87d1b1fe4: Status 404 returned error can't find the container with id ac45923e9c7627031a4f5ad1d74f087ca83e61cb8fccd12c17c502a87d1b1fe4 Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.282614 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.401072 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-config\") pod \"353c6df2-5698-49e5-969e-fac665a5e6e6\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.401130 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-sb\") pod \"353c6df2-5698-49e5-969e-fac665a5e6e6\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.401171 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-svc\") pod \"353c6df2-5698-49e5-969e-fac665a5e6e6\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.401252 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-nb\") pod \"353c6df2-5698-49e5-969e-fac665a5e6e6\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.401278 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-swift-storage-0\") pod \"353c6df2-5698-49e5-969e-fac665a5e6e6\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.401302 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f49pb\" 
(UniqueName: \"kubernetes.io/projected/353c6df2-5698-49e5-969e-fac665a5e6e6-kube-api-access-f49pb\") pod \"353c6df2-5698-49e5-969e-fac665a5e6e6\" (UID: \"353c6df2-5698-49e5-969e-fac665a5e6e6\") " Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.416287 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353c6df2-5698-49e5-969e-fac665a5e6e6-kube-api-access-f49pb" (OuterVolumeSpecName: "kube-api-access-f49pb") pod "353c6df2-5698-49e5-969e-fac665a5e6e6" (UID: "353c6df2-5698-49e5-969e-fac665a5e6e6"). InnerVolumeSpecName "kube-api-access-f49pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.463010 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "353c6df2-5698-49e5-969e-fac665a5e6e6" (UID: "353c6df2-5698-49e5-969e-fac665a5e6e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.463201 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-config" (OuterVolumeSpecName: "config") pod "353c6df2-5698-49e5-969e-fac665a5e6e6" (UID: "353c6df2-5698-49e5-969e-fac665a5e6e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.463358 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "353c6df2-5698-49e5-969e-fac665a5e6e6" (UID: "353c6df2-5698-49e5-969e-fac665a5e6e6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.469171 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "353c6df2-5698-49e5-969e-fac665a5e6e6" (UID: "353c6df2-5698-49e5-969e-fac665a5e6e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.489159 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "353c6df2-5698-49e5-969e-fac665a5e6e6" (UID: "353c6df2-5698-49e5-969e-fac665a5e6e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.502734 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.502767 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.502779 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.502787 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:01 crc 
kubenswrapper[4948]: I1204 17:57:01.502795 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/353c6df2-5698-49e5-969e-fac665a5e6e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.502804 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f49pb\" (UniqueName: \"kubernetes.io/projected/353c6df2-5698-49e5-969e-fac665a5e6e6-kube-api-access-f49pb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.725129 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" event={"ID":"a7701ffc-1871-4c71-8048-23b425f47dec","Type":"ContainerStarted","Data":"3202098028b2aec5df8b1d35b1597ce08799d0d96e1b1541bf499b9b21f0b1be"} Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.732772 4948 generic.go:334] "Generic (PLEG): container finished" podID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerID="64d3bd30618560134ad1780ad81211017de93b4ca3a2cbabc9ae78e1066bbd6b" exitCode=143 Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.732815 4948 generic.go:334] "Generic (PLEG): container finished" podID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerID="1086202a8a9df8e39256b991874791e58186abcfa2a20f4082f0d19c8e76202a" exitCode=143 Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.732872 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a563ce20-3f9a-4fed-972b-bf00209315b2","Type":"ContainerDied","Data":"64d3bd30618560134ad1780ad81211017de93b4ca3a2cbabc9ae78e1066bbd6b"} Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.732921 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a563ce20-3f9a-4fed-972b-bf00209315b2","Type":"ContainerDied","Data":"1086202a8a9df8e39256b991874791e58186abcfa2a20f4082f0d19c8e76202a"} 
Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.734791 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" event={"ID":"353c6df2-5698-49e5-969e-fac665a5e6e6","Type":"ContainerDied","Data":"e902180dda923a473c328c1c621898ba4186a9366722ff204e48dcf65373de9c"} Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.734834 4948 scope.go:117] "RemoveContainer" containerID="de8a6cfd06eb0b3173ebbf580760c2fec9712c5e5fc56756ae85756a77eb34fb" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.734955 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xxthl" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.748687 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ba827a1-34af-4a65-8b05-18d80df57324","Type":"ContainerStarted","Data":"4e80c18e8fadf5646e4c1128551c4bd8744b986403bb55e6f92e5a0338335034"} Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.748856 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-log" containerID="cri-o://0d9bd5f05df11b25ce7671a425cd507c88b9ea45659b48b77b88a070061bf1ba" gracePeriod=30 Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.749486 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-httpd" containerID="cri-o://4e80c18e8fadf5646e4c1128551c4bd8744b986403bb55e6f92e5a0338335034" gracePeriod=30 Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.753663 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-759ffd8674-fjwkq" 
event={"ID":"66ea68ed-1808-416c-b0e8-a2682d3d3b1f","Type":"ContainerStarted","Data":"ac45923e9c7627031a4f5ad1d74f087ca83e61cb8fccd12c17c502a87d1b1fe4"} Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.778348 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.778321655 podStartE2EDuration="19.778321655s" podCreationTimestamp="2025-12-04 17:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:01.767884668 +0000 UTC m=+1833.128959070" watchObservedRunningTime="2025-12-04 17:57:01.778321655 +0000 UTC m=+1833.139396057" Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.837627 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xxthl"] Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.850654 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xxthl"] Dec 04 17:57:01 crc kubenswrapper[4948]: I1204 17:57:01.913803 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:57:01 crc kubenswrapper[4948]: E1204 17:57:01.914263 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.072079 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113264 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-logs\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113315 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113381 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-httpd-run\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113408 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cp59\" (UniqueName: \"kubernetes.io/projected/a563ce20-3f9a-4fed-972b-bf00209315b2-kube-api-access-6cp59\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113534 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-config-data\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113563 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-scripts\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.113662 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-combined-ca-bundle\") pod \"a563ce20-3f9a-4fed-972b-bf00209315b2\" (UID: \"a563ce20-3f9a-4fed-972b-bf00209315b2\") " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.114315 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.114572 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-logs" (OuterVolumeSpecName: "logs") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.128238 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.128531 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-scripts" (OuterVolumeSpecName: "scripts") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.135607 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a563ce20-3f9a-4fed-972b-bf00209315b2-kube-api-access-6cp59" (OuterVolumeSpecName: "kube-api-access-6cp59") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "kube-api-access-6cp59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.152641 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.167588 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-config-data" (OuterVolumeSpecName: "config-data") pod "a563ce20-3f9a-4fed-972b-bf00209315b2" (UID: "a563ce20-3f9a-4fed-972b-bf00209315b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216148 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216179 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216211 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216222 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cp59\" (UniqueName: \"kubernetes.io/projected/a563ce20-3f9a-4fed-972b-bf00209315b2-kube-api-access-6cp59\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216235 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a563ce20-3f9a-4fed-972b-bf00209315b2-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216248 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.216260 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a563ce20-3f9a-4fed-972b-bf00209315b2-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.240317 4948 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.318101 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.364641 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b7b8cbd95-z6gmw"] Dec 04 17:57:02 crc kubenswrapper[4948]: E1204 17:57:02.365026 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353c6df2-5698-49e5-969e-fac665a5e6e6" containerName="init" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.365045 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="353c6df2-5698-49e5-969e-fac665a5e6e6" containerName="init" Dec 04 17:57:02 crc kubenswrapper[4948]: E1204 17:57:02.365097 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-log" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.365110 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-log" Dec 04 17:57:02 crc kubenswrapper[4948]: E1204 17:57:02.365147 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-httpd" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.365156 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-httpd" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.365356 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-httpd" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.365379 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="353c6df2-5698-49e5-969e-fac665a5e6e6" containerName="init" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.365391 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" containerName="glance-log" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.370643 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.377614 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.377697 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.396757 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7b8cbd95-z6gmw"] Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.419904 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-public-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.419968 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-ovndb-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.420033 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-httpd-config\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.420061 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-config\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.420092 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-internal-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.420121 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdfb\" (UniqueName: \"kubernetes.io/projected/0fc74dcc-f8d8-4852-913a-77cb4526eed7-kube-api-access-mpdfb\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.420148 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-combined-ca-bundle\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521092 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-httpd-config\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521143 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-config\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521160 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-internal-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521190 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdfb\" (UniqueName: \"kubernetes.io/projected/0fc74dcc-f8d8-4852-913a-77cb4526eed7-kube-api-access-mpdfb\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521219 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-combined-ca-bundle\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521248 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-public-tls-certs\") pod 
\"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.521288 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-ovndb-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.524911 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-ovndb-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.525667 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-combined-ca-bundle\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.527128 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-internal-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.527505 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-public-tls-certs\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " 
pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.531466 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-httpd-config\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.534743 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-config\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.545895 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdfb\" (UniqueName: \"kubernetes.io/projected/0fc74dcc-f8d8-4852-913a-77cb4526eed7-kube-api-access-mpdfb\") pod \"neutron-6b7b8cbd95-z6gmw\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.708103 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.764204 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-759ffd8674-fjwkq" event={"ID":"66ea68ed-1808-416c-b0e8-a2682d3d3b1f","Type":"ContainerStarted","Data":"1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.764509 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.764527 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-759ffd8674-fjwkq" event={"ID":"66ea68ed-1808-416c-b0e8-a2682d3d3b1f","Type":"ContainerStarted","Data":"8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.768537 4948 generic.go:334] "Generic (PLEG): container finished" podID="a7701ffc-1871-4c71-8048-23b425f47dec" containerID="e34ab80f8360f27401f94c95850d2fccb2426f458633fb5b5aee2d12a1f864f1" exitCode=0 Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.768637 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" event={"ID":"a7701ffc-1871-4c71-8048-23b425f47dec","Type":"ContainerDied","Data":"e34ab80f8360f27401f94c95850d2fccb2426f458633fb5b5aee2d12a1f864f1"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.783375 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7zdlz" event={"ID":"8913b68d-4b7f-4a2e-b097-a60b0f557827","Type":"ContainerStarted","Data":"1d1c539929d00b4f50893be637194adb40ec4e3377a6f0bd73cc9431dafdb02f"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.790333 4948 generic.go:334] "Generic (PLEG): container finished" podID="1bd09899-e64d-4b12-b604-dcd87d9c868b" containerID="ff7daf90f4531c9b225c2a333f522204056bff71a75157eece31bc57ae7f99af" exitCode=0 Dec 
04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.790405 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxfqv" event={"ID":"1bd09899-e64d-4b12-b604-dcd87d9c868b","Type":"ContainerDied","Data":"ff7daf90f4531c9b225c2a333f522204056bff71a75157eece31bc57ae7f99af"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.832081 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a563ce20-3f9a-4fed-972b-bf00209315b2","Type":"ContainerDied","Data":"721a9e2a3a3fe0720872365680ab265e665f3e538e95c40a95e1cdaec1925512"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.832143 4948 scope.go:117] "RemoveContainer" containerID="64d3bd30618560134ad1780ad81211017de93b4ca3a2cbabc9ae78e1066bbd6b" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.832281 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.842517 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-759ffd8674-fjwkq" podStartSLOduration=2.842490582 podStartE2EDuration="2.842490582s" podCreationTimestamp="2025-12-04 17:57:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:02.801513096 +0000 UTC m=+1834.162587498" watchObservedRunningTime="2025-12-04 17:57:02.842490582 +0000 UTC m=+1834.203564984" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.875280 4948 generic.go:334] "Generic (PLEG): container finished" podID="6ba827a1-34af-4a65-8b05-18d80df57324" containerID="4e80c18e8fadf5646e4c1128551c4bd8744b986403bb55e6f92e5a0338335034" exitCode=0 Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.875377 4948 generic.go:334] "Generic (PLEG): container finished" podID="6ba827a1-34af-4a65-8b05-18d80df57324" 
containerID="0d9bd5f05df11b25ce7671a425cd507c88b9ea45659b48b77b88a070061bf1ba" exitCode=143 Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.875408 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ba827a1-34af-4a65-8b05-18d80df57324","Type":"ContainerDied","Data":"4e80c18e8fadf5646e4c1128551c4bd8744b986403bb55e6f92e5a0338335034"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.875452 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ba827a1-34af-4a65-8b05-18d80df57324","Type":"ContainerDied","Data":"0d9bd5f05df11b25ce7671a425cd507c88b9ea45659b48b77b88a070061bf1ba"} Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.937617 4948 scope.go:117] "RemoveContainer" containerID="1086202a8a9df8e39256b991874791e58186abcfa2a20f4082f0d19c8e76202a" Dec 04 17:57:02 crc kubenswrapper[4948]: I1204 17:57:02.946456 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7zdlz" podStartSLOduration=3.477412736 podStartE2EDuration="29.946428322s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="2025-12-04 17:56:35.913476445 +0000 UTC m=+1807.274550847" lastFinishedPulling="2025-12-04 17:57:02.382492031 +0000 UTC m=+1833.743566433" observedRunningTime="2025-12-04 17:57:02.85193778 +0000 UTC m=+1834.213012182" watchObservedRunningTime="2025-12-04 17:57:02.946428322 +0000 UTC m=+1834.307502724" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.141019 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353c6df2-5698-49e5-969e-fac665a5e6e6" path="/var/lib/kubelet/pods/353c6df2-5698-49e5-969e-fac665a5e6e6/volumes" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.269928 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.277648 
4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.287740 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.289303 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.295915 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.296179 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.314210 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.379638 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxfm\" (UniqueName: \"kubernetes.io/projected/d1cb425a-165a-4ba6-9316-3b8954b2b395-kube-api-access-4mxfm\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.379697 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.379766 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.379794 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.379984 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.380151 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.380190 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.380229 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481389 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481437 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481462 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481508 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxfm\" (UniqueName: \"kubernetes.io/projected/d1cb425a-165a-4ba6-9316-3b8954b2b395-kube-api-access-4mxfm\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481534 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481571 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481610 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.481651 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.482047 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.482127 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.484662 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.487708 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.491368 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.494935 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.497237 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.500162 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mxfm\" (UniqueName: \"kubernetes.io/projected/d1cb425a-165a-4ba6-9316-3b8954b2b395-kube-api-access-4mxfm\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.536243 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.589527 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b7b8cbd95-z6gmw"] Dec 04 17:57:03 crc kubenswrapper[4948]: I1204 17:57:03.629028 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:04 crc kubenswrapper[4948]: I1204 17:57:04.935491 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a563ce20-3f9a-4fed-972b-bf00209315b2" path="/var/lib/kubelet/pods/a563ce20-3f9a-4fed-972b-bf00209315b2/volumes" Dec 04 17:57:09 crc kubenswrapper[4948]: I1204 17:57:09.991182 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ba827a1-34af-4a65-8b05-18d80df57324","Type":"ContainerDied","Data":"d1c6b4551a70181d4e6378086d8cdbb40dc648a5a4cf07f5594c7b43309927dc"} Dec 04 17:57:09 crc kubenswrapper[4948]: I1204 17:57:09.991868 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c6b4551a70181d4e6378086d8cdbb40dc648a5a4cf07f5594c7b43309927dc" Dec 04 17:57:09 crc kubenswrapper[4948]: I1204 17:57:09.993545 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7b8cbd95-z6gmw" event={"ID":"0fc74dcc-f8d8-4852-913a-77cb4526eed7","Type":"ContainerStarted","Data":"2d8100129d8b34199d7f61946d7edb63140d5479620d97db65067c0b26d93c47"} Dec 04 17:57:09 crc kubenswrapper[4948]: I1204 17:57:09.995098 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xxfqv" event={"ID":"1bd09899-e64d-4b12-b604-dcd87d9c868b","Type":"ContainerDied","Data":"3ae9e493d43b87480463e8f38ed6b12425463e49f0b4c6f6d7240ef507d98a02"} Dec 04 17:57:09 crc kubenswrapper[4948]: I1204 17:57:09.995124 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ae9e493d43b87480463e8f38ed6b12425463e49f0b4c6f6d7240ef507d98a02" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.080875 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.101881 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.140762 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-config-data\") pod \"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.140883 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-combined-ca-bundle\") pod \"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.141032 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-logs\") pod \"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.141099 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-combined-ca-bundle\") pod \"1bd09899-e64d-4b12-b604-dcd87d9c868b\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.141139 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-httpd-run\") pod \"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.141156 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.141696 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-logs" (OuterVolumeSpecName: "logs") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.142213 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.142355 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xtdh\" (UniqueName: \"kubernetes.io/projected/6ba827a1-34af-4a65-8b05-18d80df57324-kube-api-access-9xtdh\") pod \"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.142428 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-db-sync-config-data\") pod \"1bd09899-e64d-4b12-b604-dcd87d9c868b\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.142522 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-scripts\") pod 
\"6ba827a1-34af-4a65-8b05-18d80df57324\" (UID: \"6ba827a1-34af-4a65-8b05-18d80df57324\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.142572 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fzwj\" (UniqueName: \"kubernetes.io/projected/1bd09899-e64d-4b12-b604-dcd87d9c868b-kube-api-access-5fzwj\") pod \"1bd09899-e64d-4b12-b604-dcd87d9c868b\" (UID: \"1bd09899-e64d-4b12-b604-dcd87d9c868b\") " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.143080 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.143098 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ba827a1-34af-4a65-8b05-18d80df57324-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.158181 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-scripts" (OuterVolumeSpecName: "scripts") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.158345 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd09899-e64d-4b12-b604-dcd87d9c868b-kube-api-access-5fzwj" (OuterVolumeSpecName: "kube-api-access-5fzwj") pod "1bd09899-e64d-4b12-b604-dcd87d9c868b" (UID: "1bd09899-e64d-4b12-b604-dcd87d9c868b"). InnerVolumeSpecName "kube-api-access-5fzwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.158372 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba827a1-34af-4a65-8b05-18d80df57324-kube-api-access-9xtdh" (OuterVolumeSpecName: "kube-api-access-9xtdh") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "kube-api-access-9xtdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.158428 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1bd09899-e64d-4b12-b604-dcd87d9c868b" (UID: "1bd09899-e64d-4b12-b604-dcd87d9c868b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.159509 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.172704 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.172711 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bd09899-e64d-4b12-b604-dcd87d9c868b" (UID: "1bd09899-e64d-4b12-b604-dcd87d9c868b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.196814 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-config-data" (OuterVolumeSpecName: "config-data") pod "6ba827a1-34af-4a65-8b05-18d80df57324" (UID: "6ba827a1-34af-4a65-8b05-18d80df57324"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244751 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244791 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244823 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244858 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xtdh\" (UniqueName: \"kubernetes.io/projected/6ba827a1-34af-4a65-8b05-18d80df57324-kube-api-access-9xtdh\") on node \"crc\" 
DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244870 4948 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1bd09899-e64d-4b12-b604-dcd87d9c868b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244879 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244888 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fzwj\" (UniqueName: \"kubernetes.io/projected/1bd09899-e64d-4b12-b604-dcd87d9c868b-kube-api-access-5fzwj\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.244899 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba827a1-34af-4a65-8b05-18d80df57324-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.261231 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.346678 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:10 crc kubenswrapper[4948]: I1204 17:57:10.405983 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:57:10 crc kubenswrapper[4948]: W1204 17:57:10.407744 4948 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1cb425a_165a_4ba6_9316_3b8954b2b395.slice/crio-9db2ceb9c205d0c114fd59c759e49100776d4524b356a1e0554808a9dd2d66e6 WatchSource:0}: Error finding container 9db2ceb9c205d0c114fd59c759e49100776d4524b356a1e0554808a9dd2d66e6: Status 404 returned error can't find the container with id 9db2ceb9c205d0c114fd59c759e49100776d4524b356a1e0554808a9dd2d66e6 Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.007102 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" event={"ID":"a7701ffc-1871-4c71-8048-23b425f47dec","Type":"ContainerStarted","Data":"eb0530a13dcb90ad020624c83b0c4f0ba66724bd72d7d7b74e315e2bb6873966"} Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.008952 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7b8cbd95-z6gmw" event={"ID":"0fc74dcc-f8d8-4852-913a-77cb4526eed7","Type":"ContainerStarted","Data":"86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0"} Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.010563 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xxfqv" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.010564 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1cb425a-165a-4ba6-9316-3b8954b2b395","Type":"ContainerStarted","Data":"dfd1336b1739747d81929cc4e0ad9e6bb803c019fe5011d59f6ebad326abb1c8"} Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.010568 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.010661 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1cb425a-165a-4ba6-9316-3b8954b2b395","Type":"ContainerStarted","Data":"9db2ceb9c205d0c114fd59c759e49100776d4524b356a1e0554808a9dd2d66e6"} Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.037862 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.045644 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.071717 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:57:11 crc kubenswrapper[4948]: E1204 17:57:11.072417 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd09899-e64d-4b12-b604-dcd87d9c868b" containerName="barbican-db-sync" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.072441 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd09899-e64d-4b12-b604-dcd87d9c868b" containerName="barbican-db-sync" Dec 04 17:57:11 crc kubenswrapper[4948]: E1204 17:57:11.072477 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-httpd" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.072485 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-httpd" Dec 04 17:57:11 crc kubenswrapper[4948]: E1204 17:57:11.072513 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-log" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.072520 4948 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-log" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.072727 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-log" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.072753 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd09899-e64d-4b12-b604-dcd87d9c868b" containerName="barbican-db-sync" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.072765 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" containerName="glance-httpd" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.074026 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.081557 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.090885 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.091171 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.158928 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.159245 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.159361 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-logs\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.160445 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.160908 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.160993 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.163403 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrhk\" (UniqueName: 
\"kubernetes.io/projected/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-kube-api-access-lxrhk\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.164983 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267227 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267299 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-logs\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267326 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267393 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267419 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267451 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrhk\" (UniqueName: \"kubernetes.io/projected/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-kube-api-access-lxrhk\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267475 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.267503 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.268884 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.275968 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.334585 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.334923 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-logs\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.335027 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.335030 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " 
pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.335600 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.338161 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrhk\" (UniqueName: \"kubernetes.io/projected/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-kube-api-access-lxrhk\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.352265 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.392757 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.535941 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-54fb4df596-9xk9m"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.543029 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.550771 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.551035 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.551187 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9vqcg" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.572579 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54fb4df596-9xk9m"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.575000 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.575070 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data-custom\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.575111 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e905edc7-cd78-48c2-9192-fb18e1d193ac-logs\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: 
\"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.575209 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-combined-ca-bundle\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.575246 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqvbt\" (UniqueName: \"kubernetes.io/projected/e905edc7-cd78-48c2-9192-fb18e1d193ac-kube-api-access-fqvbt\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.608216 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6dbb7d984c-hzlwz"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.609595 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.614656 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.658794 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6dbb7d984c-hzlwz"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679086 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679395 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data-custom\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679552 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e905edc7-cd78-48c2-9192-fb18e1d193ac-logs\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679679 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data-custom\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: 
\"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679757 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679858 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxnt2\" (UniqueName: \"kubernetes.io/projected/c94e22e0-c0d1-4233-b21c-9860d204c068-kube-api-access-mxnt2\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.679927 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-combined-ca-bundle\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.680015 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqvbt\" (UniqueName: \"kubernetes.io/projected/e905edc7-cd78-48c2-9192-fb18e1d193ac-kube-api-access-fqvbt\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.680103 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-combined-ca-bundle\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.680174 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94e22e0-c0d1-4233-b21c-9860d204c068-logs\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.680574 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e905edc7-cd78-48c2-9192-fb18e1d193ac-logs\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.687112 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-n5kht"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.715535 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.716092 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-combined-ca-bundle\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 
17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.716437 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data-custom\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.747473 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqvbt\" (UniqueName: \"kubernetes.io/projected/e905edc7-cd78-48c2-9192-fb18e1d193ac-kube-api-access-fqvbt\") pod \"barbican-keystone-listener-54fb4df596-9xk9m\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.782419 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data-custom\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.782472 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.782512 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxnt2\" (UniqueName: \"kubernetes.io/projected/c94e22e0-c0d1-4233-b21c-9860d204c068-kube-api-access-mxnt2\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " 
pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.782553 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-combined-ca-bundle\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.782586 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94e22e0-c0d1-4233-b21c-9860d204c068-logs\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.783589 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94e22e0-c0d1-4233-b21c-9860d204c068-logs\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.788386 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.805653 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data-custom\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc 
kubenswrapper[4948]: I1204 17:57:11.806429 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-combined-ca-bundle\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.810009 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxnt2\" (UniqueName: \"kubernetes.io/projected/c94e22e0-c0d1-4233-b21c-9860d204c068-kube-api-access-mxnt2\") pod \"barbican-worker-6dbb7d984c-hzlwz\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.818883 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-zv76w"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.820427 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.826786 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-zv76w"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.837868 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f549dc79b-r6txc"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.842768 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.844899 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.851096 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f549dc79b-r6txc"] Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.863545 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883561 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-logs\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883685 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data-custom\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883708 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883724 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883760 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883871 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-combined-ca-bundle\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.883971 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.884000 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7g4z\" (UniqueName: \"kubernetes.io/projected/7fa9312c-3146-4a5e-9db6-acc251aa60c6-kube-api-access-x7g4z\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.884072 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.884103 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.884250 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfcw\" (UniqueName: \"kubernetes.io/projected/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-kube-api-access-vwfcw\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.953427 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988016 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-combined-ca-bundle\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988088 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988137 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988165 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7g4z\" (UniqueName: \"kubernetes.io/projected/7fa9312c-3146-4a5e-9db6-acc251aa60c6-kube-api-access-x7g4z\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988194 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988234 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988350 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfcw\" (UniqueName: \"kubernetes.io/projected/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-kube-api-access-vwfcw\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988432 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-logs\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988515 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data-custom\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.988533 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 
crc kubenswrapper[4948]: I1204 17:57:11.988552 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.990381 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.991355 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:11 crc kubenswrapper[4948]: I1204 17:57:11.991934 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-combined-ca-bundle\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.002232 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-logs\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.007132 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data-custom\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.008426 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.008529 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.009835 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7g4z\" (UniqueName: \"kubernetes.io/projected/7fa9312c-3146-4a5e-9db6-acc251aa60c6-kube-api-access-x7g4z\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.010627 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.010971 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-zv76w\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.011921 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfcw\" (UniqueName: \"kubernetes.io/projected/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-kube-api-access-vwfcw\") pod \"barbican-api-5f549dc79b-r6txc\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.024660 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.054423 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" podStartSLOduration=13.054405435 podStartE2EDuration="13.054405435s" podCreationTimestamp="2025-12-04 17:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:12.04133697 +0000 UTC m=+1843.402411372" watchObservedRunningTime="2025-12-04 17:57:12.054405435 +0000 UTC m=+1843.415479837" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.201358 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.210755 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:12 crc kubenswrapper[4948]: I1204 17:57:12.927184 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba827a1-34af-4a65-8b05-18d80df57324" path="/var/lib/kubelet/pods/6ba827a1-34af-4a65-8b05-18d80df57324/volumes" Dec 04 17:57:13 crc kubenswrapper[4948]: I1204 17:57:13.038822 4948 generic.go:334] "Generic (PLEG): container finished" podID="f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" containerID="f64cbf7b6e43ed96dc6f100a33115d3b1f5b0f8fe82df36ecfd2f69eebd3aea8" exitCode=0 Dec 04 17:57:13 crc kubenswrapper[4948]: I1204 17:57:13.038905 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjcpf" event={"ID":"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a","Type":"ContainerDied","Data":"f64cbf7b6e43ed96dc6f100a33115d3b1f5b0f8fe82df36ecfd2f69eebd3aea8"} Dec 04 17:57:13 crc kubenswrapper[4948]: I1204 17:57:13.039104 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" containerName="dnsmasq-dns" containerID="cri-o://eb0530a13dcb90ad020624c83b0c4f0ba66724bd72d7d7b74e315e2bb6873966" gracePeriod=10 Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.058121 4948 generic.go:334] "Generic (PLEG): container finished" podID="a7701ffc-1871-4c71-8048-23b425f47dec" containerID="eb0530a13dcb90ad020624c83b0c4f0ba66724bd72d7d7b74e315e2bb6873966" exitCode=0 Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.058320 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" event={"ID":"a7701ffc-1871-4c71-8048-23b425f47dec","Type":"ContainerDied","Data":"eb0530a13dcb90ad020624c83b0c4f0ba66724bd72d7d7b74e315e2bb6873966"} Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.229972 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d5f54fb74-68pcc"] Dec 04 17:57:14 crc 
kubenswrapper[4948]: I1204 17:57:14.231473 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.245975 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.254232 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.256749 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-combined-ca-bundle\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.256830 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data-custom\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.256906 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-logs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.256980 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw92t\" (UniqueName: 
\"kubernetes.io/projected/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-kube-api-access-vw92t\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.257174 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.257211 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-internal-tls-certs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.257291 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-public-tls-certs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.259984 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d5f54fb74-68pcc"] Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.359243 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-logs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc 
kubenswrapper[4948]: I1204 17:57:14.359602 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw92t\" (UniqueName: \"kubernetes.io/projected/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-kube-api-access-vw92t\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.359652 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-logs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.359747 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.359797 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-internal-tls-certs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.359884 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-public-tls-certs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.360112 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-combined-ca-bundle\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.360142 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data-custom\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.371187 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-combined-ca-bundle\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.373708 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.377902 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data-custom\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.379805 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-public-tls-certs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.383520 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-internal-tls-certs\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.384559 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw92t\" (UniqueName: \"kubernetes.io/projected/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-kube-api-access-vw92t\") pod \"barbican-api-d5f54fb74-68pcc\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.610038 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:14 crc kubenswrapper[4948]: I1204 17:57:14.914237 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:57:14 crc kubenswrapper[4948]: E1204 17:57:14.914560 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:57:15 crc kubenswrapper[4948]: I1204 17:57:15.371569 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Dec 04 17:57:17 crc kubenswrapper[4948]: I1204 17:57:17.991812 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.084227 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-scripts\") pod \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.084303 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-combined-ca-bundle\") pod \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.084350 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtff5\" (UniqueName: \"kubernetes.io/projected/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-kube-api-access-dtff5\") pod \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.084439 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-fernet-keys\") pod \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.084472 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-config-data\") pod \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.084535 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-credential-keys\") pod \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\" (UID: \"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a\") " Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.106261 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" (UID: "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.159228 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" (UID: "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.162136 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-scripts" (OuterVolumeSpecName: "scripts") pod "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" (UID: "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.165524 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rjcpf" event={"ID":"f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a","Type":"ContainerDied","Data":"73e4851cd1c68069fef829f85b47587493def1fff470392cf31d1ce17eae120a"} Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.165564 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73e4851cd1c68069fef829f85b47587493def1fff470392cf31d1ce17eae120a" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.165630 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rjcpf" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.172256 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-kube-api-access-dtff5" (OuterVolumeSpecName: "kube-api-access-dtff5") pod "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" (UID: "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a"). InnerVolumeSpecName "kube-api-access-dtff5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.186022 4948 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.186112 4948 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.186123 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.186133 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtff5\" (UniqueName: \"kubernetes.io/projected/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-kube-api-access-dtff5\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.203235 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-config-data" (OuterVolumeSpecName: "config-data") pod "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" (UID: "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.211029 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" (UID: "f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.287378 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.287415 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:18 crc kubenswrapper[4948]: I1204 17:57:18.831301 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.000873 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-svc\") pod \"a7701ffc-1871-4c71-8048-23b425f47dec\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.001214 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-nb\") pod \"a7701ffc-1871-4c71-8048-23b425f47dec\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.001333 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-config\") pod \"a7701ffc-1871-4c71-8048-23b425f47dec\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.001385 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnzd8\" (UniqueName: 
\"kubernetes.io/projected/a7701ffc-1871-4c71-8048-23b425f47dec-kube-api-access-nnzd8\") pod \"a7701ffc-1871-4c71-8048-23b425f47dec\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.001447 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-swift-storage-0\") pod \"a7701ffc-1871-4c71-8048-23b425f47dec\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.001480 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-sb\") pod \"a7701ffc-1871-4c71-8048-23b425f47dec\" (UID: \"a7701ffc-1871-4c71-8048-23b425f47dec\") " Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.005816 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7701ffc-1871-4c71-8048-23b425f47dec-kube-api-access-nnzd8" (OuterVolumeSpecName: "kube-api-access-nnzd8") pod "a7701ffc-1871-4c71-8048-23b425f47dec" (UID: "a7701ffc-1871-4c71-8048-23b425f47dec"). InnerVolumeSpecName "kube-api-access-nnzd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.063452 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a7701ffc-1871-4c71-8048-23b425f47dec" (UID: "a7701ffc-1871-4c71-8048-23b425f47dec"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.071958 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7701ffc-1871-4c71-8048-23b425f47dec" (UID: "a7701ffc-1871-4c71-8048-23b425f47dec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.076919 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-config" (OuterVolumeSpecName: "config") pod "a7701ffc-1871-4c71-8048-23b425f47dec" (UID: "a7701ffc-1871-4c71-8048-23b425f47dec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.077649 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a7701ffc-1871-4c71-8048-23b425f47dec" (UID: "a7701ffc-1871-4c71-8048-23b425f47dec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.093950 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a7701ffc-1871-4c71-8048-23b425f47dec" (UID: "a7701ffc-1871-4c71-8048-23b425f47dec"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.104165 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.104204 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.104216 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.104226 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnzd8\" (UniqueName: \"kubernetes.io/projected/a7701ffc-1871-4c71-8048-23b425f47dec-kube-api-access-nnzd8\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.104236 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.104245 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7701ffc-1871-4c71-8048-23b425f47dec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.108321 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-54df7858f8-fz456"] Dec 04 17:57:19 crc kubenswrapper[4948]: E1204 17:57:19.108679 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" 
containerName="init" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.108695 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" containerName="init" Dec 04 17:57:19 crc kubenswrapper[4948]: E1204 17:57:19.108723 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" containerName="keystone-bootstrap" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.108730 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" containerName="keystone-bootstrap" Dec 04 17:57:19 crc kubenswrapper[4948]: E1204 17:57:19.108754 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" containerName="dnsmasq-dns" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.108759 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" containerName="dnsmasq-dns" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.109098 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" containerName="keystone-bootstrap" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.109129 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" containerName="dnsmasq-dns" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.109625 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.115693 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.115977 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ltmz5" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.116172 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.116333 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.116504 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.118012 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.129061 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54df7858f8-fz456"] Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.176449 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" event={"ID":"a7701ffc-1871-4c71-8048-23b425f47dec","Type":"ContainerDied","Data":"3202098028b2aec5df8b1d35b1597ce08799d0d96e1b1541bf499b9b21f0b1be"} Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.176554 4948 scope.go:117] "RemoveContainer" containerID="eb0530a13dcb90ad020624c83b0c4f0ba66724bd72d7d7b74e315e2bb6873966" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.176683 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-n5kht" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.207762 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-internal-tls-certs\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208112 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-config-data\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208137 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-public-tls-certs\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208159 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-fernet-keys\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208189 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-credential-keys\") pod \"keystone-54df7858f8-fz456\" (UID: 
\"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208214 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqvkv\" (UniqueName: \"kubernetes.io/projected/fb168081-824d-45ef-a815-b96d44b58b7c-kube-api-access-rqvkv\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208243 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-combined-ca-bundle\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.208282 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-scripts\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.215330 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-n5kht"] Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.223132 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-n5kht"] Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.229896 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-54fb4df596-9xk9m"] Dec 04 17:57:19 crc kubenswrapper[4948]: W1204 17:57:19.274288 4948 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode905edc7_cd78_48c2_9192_fb18e1d193ac.slice/crio-d5cf4dfb906cc721346b4bc7c9c652ff344ea93e5dfdf569189b114e8ff3fcdc WatchSource:0}: Error finding container d5cf4dfb906cc721346b4bc7c9c652ff344ea93e5dfdf569189b114e8ff3fcdc: Status 404 returned error can't find the container with id d5cf4dfb906cc721346b4bc7c9c652ff344ea93e5dfdf569189b114e8ff3fcdc Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310117 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-internal-tls-certs\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310171 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-public-tls-certs\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310190 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-config-data\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310207 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-fernet-keys\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310236 
4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-credential-keys\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310464 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvkv\" (UniqueName: \"kubernetes.io/projected/fb168081-824d-45ef-a815-b96d44b58b7c-kube-api-access-rqvkv\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310498 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-combined-ca-bundle\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.310535 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-scripts\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.313530 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-public-tls-certs\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.313768 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-combined-ca-bundle\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.313819 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-scripts\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.314765 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-internal-tls-certs\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.314964 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-config-data\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.319315 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-credential-keys\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.319668 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-fernet-keys\") pod \"keystone-54df7858f8-fz456\" 
(UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.327234 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqvkv\" (UniqueName: \"kubernetes.io/projected/fb168081-824d-45ef-a815-b96d44b58b7c-kube-api-access-rqvkv\") pod \"keystone-54df7858f8-fz456\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.476662 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.679196 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d5f54fb74-68pcc"] Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.764494 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6dbb7d984c-hzlwz"] Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.826482 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-zv76w"] Dec 04 17:57:19 crc kubenswrapper[4948]: I1204 17:57:19.932941 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:57:20 crc kubenswrapper[4948]: I1204 17:57:20.184598 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" event={"ID":"e905edc7-cd78-48c2-9192-fb18e1d193ac","Type":"ContainerStarted","Data":"d5cf4dfb906cc721346b4bc7c9c652ff344ea93e5dfdf569189b114e8ff3fcdc"} Dec 04 17:57:20 crc kubenswrapper[4948]: I1204 17:57:20.900569 4948 scope.go:117] "RemoveContainer" containerID="e34ab80f8360f27401f94c95850d2fccb2426f458633fb5b5aee2d12a1f864f1" Dec 04 17:57:20 crc kubenswrapper[4948]: W1204 17:57:20.934192 4948 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf6e1e9b_1f4a_46e4_80e7_3acd4993b48e.slice/crio-70ff5ceaaddaf4edf298be3e78cb83837655b116f6fe50a7a47025e0a2c02d89 WatchSource:0}: Error finding container 70ff5ceaaddaf4edf298be3e78cb83837655b116f6fe50a7a47025e0a2c02d89: Status 404 returned error can't find the container with id 70ff5ceaaddaf4edf298be3e78cb83837655b116f6fe50a7a47025e0a2c02d89 Dec 04 17:57:20 crc kubenswrapper[4948]: I1204 17:57:20.940447 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7701ffc-1871-4c71-8048-23b425f47dec" path="/var/lib/kubelet/pods/a7701ffc-1871-4c71-8048-23b425f47dec/volumes" Dec 04 17:57:21 crc kubenswrapper[4948]: I1204 17:57:21.254440 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e","Type":"ContainerStarted","Data":"70ff5ceaaddaf4edf298be3e78cb83837655b116f6fe50a7a47025e0a2c02d89"} Dec 04 17:57:21 crc kubenswrapper[4948]: I1204 17:57:21.255385 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" event={"ID":"c94e22e0-c0d1-4233-b21c-9860d204c068","Type":"ContainerStarted","Data":"3d559e92a3f6ede06a2024ce80699b72678702bb302c5dab59e5b0260a46a82b"} Dec 04 17:57:21 crc kubenswrapper[4948]: I1204 17:57:21.258204 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" event={"ID":"7fa9312c-3146-4a5e-9db6-acc251aa60c6","Type":"ContainerStarted","Data":"0c417f06ae01252942678a7625b97553ed5bcbea9c3576045aee870c28c81cc4"} Dec 04 17:57:21 crc kubenswrapper[4948]: I1204 17:57:21.267795 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d5f54fb74-68pcc" event={"ID":"be3e0d09-a01a-4f1c-9fbd-60a23a823e31","Type":"ContainerStarted","Data":"6efe33eec024e9574e9af94a9bbfb0beadb939e856ba7628ea939a53809e5a5b"} Dec 04 17:57:21 crc kubenswrapper[4948]: I1204 17:57:21.328402 4948 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f549dc79b-r6txc"] Dec 04 17:57:21 crc kubenswrapper[4948]: W1204 17:57:21.352668 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821bbb80_f2bb_4806_93d4_7c4e74a6c39e.slice/crio-fa666fdda0328f94228b8284b9c5853809a9c5c8d24615290e56a522a2d55a17 WatchSource:0}: Error finding container fa666fdda0328f94228b8284b9c5853809a9c5c8d24615290e56a522a2d55a17: Status 404 returned error can't find the container with id fa666fdda0328f94228b8284b9c5853809a9c5c8d24615290e56a522a2d55a17 Dec 04 17:57:21 crc kubenswrapper[4948]: I1204 17:57:21.654617 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54df7858f8-fz456"] Dec 04 17:57:21 crc kubenswrapper[4948]: W1204 17:57:21.675789 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb168081_824d_45ef_a815_b96d44b58b7c.slice/crio-cfaeeb113c88db3f384ca716647858c8a39f66434ea7680d9bece6c4039a5959 WatchSource:0}: Error finding container cfaeeb113c88db3f384ca716647858c8a39f66434ea7680d9bece6c4039a5959: Status 404 returned error can't find the container with id cfaeeb113c88db3f384ca716647858c8a39f66434ea7680d9bece6c4039a5959 Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.289633 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54df7858f8-fz456" event={"ID":"fb168081-824d-45ef-a815-b96d44b58b7c","Type":"ContainerStarted","Data":"42f8d6eb61951e718daf3a1c3876ae5812998b0025aa93db9e52773f8f045f77"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.290148 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54df7858f8-fz456" event={"ID":"fb168081-824d-45ef-a815-b96d44b58b7c","Type":"ContainerStarted","Data":"cfaeeb113c88db3f384ca716647858c8a39f66434ea7680d9bece6c4039a5959"} Dec 04 17:57:22 crc 
kubenswrapper[4948]: I1204 17:57:22.290226 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.295181 4948 generic.go:334] "Generic (PLEG): container finished" podID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerID="0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e" exitCode=0 Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.295315 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" event={"ID":"7fa9312c-3146-4a5e-9db6-acc251aa60c6","Type":"ContainerDied","Data":"0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.301024 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d5f54fb74-68pcc" event={"ID":"be3e0d09-a01a-4f1c-9fbd-60a23a823e31","Type":"ContainerStarted","Data":"6b974c2237fa21ad81c9fc52a94fd34f294e0f3775cb43b085b2410092be36d4"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.301123 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d5f54fb74-68pcc" event={"ID":"be3e0d09-a01a-4f1c-9fbd-60a23a823e31","Type":"ContainerStarted","Data":"b6f89e0991c69b6cee27c2fa1ab82523519a1bb22b60f0d9a6b4e7f17f7f22bc"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.301773 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.301798 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.313424 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e","Type":"ContainerStarted","Data":"77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.315597 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7b8cbd95-z6gmw" event={"ID":"0fc74dcc-f8d8-4852-913a-77cb4526eed7","Type":"ContainerStarted","Data":"7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.315838 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.317236 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-54df7858f8-fz456" podStartSLOduration=3.317214154 podStartE2EDuration="3.317214154s" podCreationTimestamp="2025-12-04 17:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:22.305167649 +0000 UTC m=+1853.666242051" watchObservedRunningTime="2025-12-04 17:57:22.317214154 +0000 UTC m=+1853.678288576" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.327315 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerStarted","Data":"b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.333835 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d5f54fb74-68pcc" podStartSLOduration=8.333814993 podStartE2EDuration="8.333814993s" podCreationTimestamp="2025-12-04 17:57:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:22.323274352 +0000 UTC m=+1853.684348754" 
watchObservedRunningTime="2025-12-04 17:57:22.333814993 +0000 UTC m=+1853.694889395" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.343904 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1cb425a-165a-4ba6-9316-3b8954b2b395","Type":"ContainerStarted","Data":"9408661d75bd4de856da2a30fda585946339e006d4ebeff98d7f7fa2bb71d74b"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.353705 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f549dc79b-r6txc" event={"ID":"821bbb80-f2bb-4806-93d4-7c4e74a6c39e","Type":"ContainerStarted","Data":"6883ce38c46d69955cae29c9c107760859306e0bf661a6ddcea19f1e38718a22"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.353761 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f549dc79b-r6txc" event={"ID":"821bbb80-f2bb-4806-93d4-7c4e74a6c39e","Type":"ContainerStarted","Data":"fa666fdda0328f94228b8284b9c5853809a9c5c8d24615290e56a522a2d55a17"} Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.353927 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.353960 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.373903 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.373880282000002 podStartE2EDuration="19.373880282s" podCreationTimestamp="2025-12-04 17:57:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:22.37142521 +0000 UTC m=+1853.732499612" watchObservedRunningTime="2025-12-04 17:57:22.373880282 +0000 UTC m=+1853.734954684" Dec 04 17:57:22 crc 
kubenswrapper[4948]: I1204 17:57:22.403432 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b7b8cbd95-z6gmw" podStartSLOduration=20.403410211 podStartE2EDuration="20.403410211s" podCreationTimestamp="2025-12-04 17:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:22.396430186 +0000 UTC m=+1853.757504588" watchObservedRunningTime="2025-12-04 17:57:22.403410211 +0000 UTC m=+1853.764484613" Dec 04 17:57:22 crc kubenswrapper[4948]: I1204 17:57:22.416388 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f549dc79b-r6txc" podStartSLOduration=11.416371043 podStartE2EDuration="11.416371043s" podCreationTimestamp="2025-12-04 17:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:22.414502528 +0000 UTC m=+1853.775577010" watchObservedRunningTime="2025-12-04 17:57:22.416371043 +0000 UTC m=+1853.777445445" Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.364101 4948 generic.go:334] "Generic (PLEG): container finished" podID="8913b68d-4b7f-4a2e-b097-a60b0f557827" containerID="1d1c539929d00b4f50893be637194adb40ec4e3377a6f0bd73cc9431dafdb02f" exitCode=0 Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.364332 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7zdlz" event={"ID":"8913b68d-4b7f-4a2e-b097-a60b0f557827","Type":"ContainerDied","Data":"1d1c539929d00b4f50893be637194adb40ec4e3377a6f0bd73cc9431dafdb02f"} Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.367473 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e","Type":"ContainerStarted","Data":"8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff"} 
Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.369313 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nccrm" event={"ID":"2b81424a-68f9-40e6-bd32-a932a675578a","Type":"ContainerStarted","Data":"0d71e6bfecc189c1867608db7a4fa58effd809b7c670edfd46414b17cef466f3"} Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.372028 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f549dc79b-r6txc" event={"ID":"821bbb80-f2bb-4806-93d4-7c4e74a6c39e","Type":"ContainerStarted","Data":"74abda16ddeb18724719233b41a606846e167c1663b5d505e84c22f45c88a538"} Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.419526 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nccrm" podStartSLOduration=5.337236054 podStartE2EDuration="50.419508393s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="2025-12-04 17:56:36.021763132 +0000 UTC m=+1807.382837534" lastFinishedPulling="2025-12-04 17:57:21.104035471 +0000 UTC m=+1852.465109873" observedRunningTime="2025-12-04 17:57:23.418340088 +0000 UTC m=+1854.779414490" watchObservedRunningTime="2025-12-04 17:57:23.419508393 +0000 UTC m=+1854.780582795" Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.629929 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.630562 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.669773 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:23 crc kubenswrapper[4948]: I1204 17:57:23.673462 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:24 crc 
kubenswrapper[4948]: I1204 17:57:24.384385 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:24 crc kubenswrapper[4948]: I1204 17:57:24.385112 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:24 crc kubenswrapper[4948]: I1204 17:57:24.418399 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.418380007 podStartE2EDuration="13.418380007s" podCreationTimestamp="2025-12-04 17:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:24.40964567 +0000 UTC m=+1855.770720072" watchObservedRunningTime="2025-12-04 17:57:24.418380007 +0000 UTC m=+1855.779454409" Dec 04 17:57:24 crc kubenswrapper[4948]: I1204 17:57:24.900426 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7zdlz" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.006146 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-scripts\") pod \"8913b68d-4b7f-4a2e-b097-a60b0f557827\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.006232 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-config-data\") pod \"8913b68d-4b7f-4a2e-b097-a60b0f557827\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.006298 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xphmb\" (UniqueName: \"kubernetes.io/projected/8913b68d-4b7f-4a2e-b097-a60b0f557827-kube-api-access-xphmb\") pod \"8913b68d-4b7f-4a2e-b097-a60b0f557827\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.006363 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-combined-ca-bundle\") pod \"8913b68d-4b7f-4a2e-b097-a60b0f557827\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.006413 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8913b68d-4b7f-4a2e-b097-a60b0f557827-logs\") pod \"8913b68d-4b7f-4a2e-b097-a60b0f557827\" (UID: \"8913b68d-4b7f-4a2e-b097-a60b0f557827\") " Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.025031 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8913b68d-4b7f-4a2e-b097-a60b0f557827-logs" (OuterVolumeSpecName: "logs") pod "8913b68d-4b7f-4a2e-b097-a60b0f557827" (UID: "8913b68d-4b7f-4a2e-b097-a60b0f557827"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.029262 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-scripts" (OuterVolumeSpecName: "scripts") pod "8913b68d-4b7f-4a2e-b097-a60b0f557827" (UID: "8913b68d-4b7f-4a2e-b097-a60b0f557827"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.029350 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8913b68d-4b7f-4a2e-b097-a60b0f557827-kube-api-access-xphmb" (OuterVolumeSpecName: "kube-api-access-xphmb") pod "8913b68d-4b7f-4a2e-b097-a60b0f557827" (UID: "8913b68d-4b7f-4a2e-b097-a60b0f557827"). InnerVolumeSpecName "kube-api-access-xphmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.063331 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-config-data" (OuterVolumeSpecName: "config-data") pod "8913b68d-4b7f-4a2e-b097-a60b0f557827" (UID: "8913b68d-4b7f-4a2e-b097-a60b0f557827"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.087616 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8913b68d-4b7f-4a2e-b097-a60b0f557827" (UID: "8913b68d-4b7f-4a2e-b097-a60b0f557827"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.122172 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.122207 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.122220 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xphmb\" (UniqueName: \"kubernetes.io/projected/8913b68d-4b7f-4a2e-b097-a60b0f557827-kube-api-access-xphmb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.122241 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8913b68d-4b7f-4a2e-b097-a60b0f557827-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.122251 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8913b68d-4b7f-4a2e-b097-a60b0f557827-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.390425 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7zdlz" event={"ID":"8913b68d-4b7f-4a2e-b097-a60b0f557827","Type":"ContainerDied","Data":"4e72858689350cb49327f8d3571ad203020ea4e4c59d5648551e8f8acdbcd85f"} Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.390478 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e72858689350cb49327f8d3571ad203020ea4e4c59d5648551e8f8acdbcd85f" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.390482 4948 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-7zdlz" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.396218 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" event={"ID":"c94e22e0-c0d1-4233-b21c-9860d204c068","Type":"ContainerStarted","Data":"f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186"} Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.396256 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" event={"ID":"c94e22e0-c0d1-4233-b21c-9860d204c068","Type":"ContainerStarted","Data":"ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b"} Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.410109 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" event={"ID":"7fa9312c-3146-4a5e-9db6-acc251aa60c6","Type":"ContainerStarted","Data":"a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4"} Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.410995 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.431517 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" podStartSLOduration=11.19040143 podStartE2EDuration="14.43149726s" podCreationTimestamp="2025-12-04 17:57:11 +0000 UTC" firstStartedPulling="2025-12-04 17:57:20.977332581 +0000 UTC m=+1852.338406983" lastFinishedPulling="2025-12-04 17:57:24.218428411 +0000 UTC m=+1855.579502813" observedRunningTime="2025-12-04 17:57:25.421963739 +0000 UTC m=+1856.783038141" watchObservedRunningTime="2025-12-04 17:57:25.43149726 +0000 UTC m=+1856.792571662" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.441745 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" event={"ID":"e905edc7-cd78-48c2-9192-fb18e1d193ac","Type":"ContainerStarted","Data":"ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c"} Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.441812 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" event={"ID":"e905edc7-cd78-48c2-9192-fb18e1d193ac","Type":"ContainerStarted","Data":"b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d"} Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.451243 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" podStartSLOduration=14.45119655 podStartE2EDuration="14.45119655s" podCreationTimestamp="2025-12-04 17:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:25.445532633 +0000 UTC m=+1856.806607035" watchObservedRunningTime="2025-12-04 17:57:25.45119655 +0000 UTC m=+1856.812270952" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.472075 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" podStartSLOduration=9.550650591 podStartE2EDuration="14.472053584s" podCreationTimestamp="2025-12-04 17:57:11 +0000 UTC" firstStartedPulling="2025-12-04 17:57:19.29438207 +0000 UTC m=+1850.655456472" lastFinishedPulling="2025-12-04 17:57:24.215785063 +0000 UTC m=+1855.576859465" observedRunningTime="2025-12-04 17:57:25.465531822 +0000 UTC m=+1856.826606224" watchObservedRunningTime="2025-12-04 17:57:25.472053584 +0000 UTC m=+1856.833127986" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.511101 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d56c8fbdd-fr7fc"] Dec 04 17:57:25 crc kubenswrapper[4948]: E1204 17:57:25.511695 4948 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8913b68d-4b7f-4a2e-b097-a60b0f557827" containerName="placement-db-sync" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.511757 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8913b68d-4b7f-4a2e-b097-a60b0f557827" containerName="placement-db-sync" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.512019 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8913b68d-4b7f-4a2e-b097-a60b0f557827" containerName="placement-db-sync" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.512990 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.518525 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.518911 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.519128 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.519319 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rp6fj" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.519488 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.534805 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d56c8fbdd-fr7fc"] Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630236 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c809e-76fd-458e-acbf-e2f6ce2d2f43-logs\") pod \"placement-d56c8fbdd-fr7fc\" 
(UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630293 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-internal-tls-certs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630317 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-combined-ca-bundle\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630338 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q28n9\" (UniqueName: \"kubernetes.io/projected/117c809e-76fd-458e-acbf-e2f6ce2d2f43-kube-api-access-q28n9\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630412 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-config-data\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630445 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-public-tls-certs\") pod 
\"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.630487 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-scripts\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732019 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-scripts\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732141 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c809e-76fd-458e-acbf-e2f6ce2d2f43-logs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732168 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-internal-tls-certs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732192 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-combined-ca-bundle\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " 
pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732220 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q28n9\" (UniqueName: \"kubernetes.io/projected/117c809e-76fd-458e-acbf-e2f6ce2d2f43-kube-api-access-q28n9\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732325 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-config-data\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732371 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-public-tls-certs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.732951 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c809e-76fd-458e-acbf-e2f6ce2d2f43-logs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.740611 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-public-tls-certs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.743362 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-combined-ca-bundle\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.743411 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-internal-tls-certs\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.743598 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-scripts\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.743915 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-config-data\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.748073 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q28n9\" (UniqueName: \"kubernetes.io/projected/117c809e-76fd-458e-acbf-e2f6ce2d2f43-kube-api-access-q28n9\") pod \"placement-d56c8fbdd-fr7fc\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:25 crc kubenswrapper[4948]: I1204 17:57:25.841789 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:26 crc kubenswrapper[4948]: I1204 17:57:26.523560 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d56c8fbdd-fr7fc"] Dec 04 17:57:26 crc kubenswrapper[4948]: W1204 17:57:26.534646 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod117c809e_76fd_458e_acbf_e2f6ce2d2f43.slice/crio-019d30a5e2404a89eb9f84fd9ea2bf3d3d9be0599889ea6459d70cfb030b8b03 WatchSource:0}: Error finding container 019d30a5e2404a89eb9f84fd9ea2bf3d3d9be0599889ea6459d70cfb030b8b03: Status 404 returned error can't find the container with id 019d30a5e2404a89eb9f84fd9ea2bf3d3d9be0599889ea6459d70cfb030b8b03 Dec 04 17:57:26 crc kubenswrapper[4948]: I1204 17:57:26.807547 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:26 crc kubenswrapper[4948]: I1204 17:57:26.855978 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 17:57:27 crc kubenswrapper[4948]: I1204 17:57:27.512888 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d56c8fbdd-fr7fc" event={"ID":"117c809e-76fd-458e-acbf-e2f6ce2d2f43","Type":"ContainerStarted","Data":"228923eb8a21101983b2bce76096ee36269db4118b9d4b8fa58a9ef47c3110a3"} Dec 04 17:57:27 crc kubenswrapper[4948]: I1204 17:57:27.513498 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d56c8fbdd-fr7fc" event={"ID":"117c809e-76fd-458e-acbf-e2f6ce2d2f43","Type":"ContainerStarted","Data":"648062ad89f6bf56de82ee3bd52951b86df3e900c322ee6b0c57f21f40ba73d8"} Dec 04 17:57:27 crc kubenswrapper[4948]: I1204 17:57:27.513572 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d56c8fbdd-fr7fc" 
event={"ID":"117c809e-76fd-458e-acbf-e2f6ce2d2f43","Type":"ContainerStarted","Data":"019d30a5e2404a89eb9f84fd9ea2bf3d3d9be0599889ea6459d70cfb030b8b03"} Dec 04 17:57:28 crc kubenswrapper[4948]: I1204 17:57:28.525594 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:28 crc kubenswrapper[4948]: I1204 17:57:28.525648 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:28 crc kubenswrapper[4948]: I1204 17:57:28.548016 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d56c8fbdd-fr7fc" podStartSLOduration=3.547999291 podStartE2EDuration="3.547999291s" podCreationTimestamp="2025-12-04 17:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:28.547662791 +0000 UTC m=+1859.908737203" watchObservedRunningTime="2025-12-04 17:57:28.547999291 +0000 UTC m=+1859.909073693" Dec 04 17:57:28 crc kubenswrapper[4948]: I1204 17:57:28.921715 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:57:28 crc kubenswrapper[4948]: E1204 17:57:28.925353 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:57:29 crc kubenswrapper[4948]: I1204 17:57:29.535246 4948 generic.go:334] "Generic (PLEG): container finished" podID="2b81424a-68f9-40e6-bd32-a932a675578a" containerID="0d71e6bfecc189c1867608db7a4fa58effd809b7c670edfd46414b17cef466f3" exitCode=0 Dec 
04 17:57:29 crc kubenswrapper[4948]: I1204 17:57:29.535989 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nccrm" event={"ID":"2b81424a-68f9-40e6-bd32-a932a675578a","Type":"ContainerDied","Data":"0d71e6bfecc189c1867608db7a4fa58effd809b7c670edfd46414b17cef466f3"} Dec 04 17:57:29 crc kubenswrapper[4948]: I1204 17:57:29.566291 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:29 crc kubenswrapper[4948]: I1204 17:57:29.714654 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:30 crc kubenswrapper[4948]: I1204 17:57:30.393493 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.091698 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.394232 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.394657 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.413661 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.453020 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.478225 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f549dc79b-r6txc"] Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.496673 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.569951 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.570123 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f549dc79b-r6txc" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api-log" containerID="cri-o://6883ce38c46d69955cae29c9c107760859306e0bf661a6ddcea19f1e38718a22" gracePeriod=30 Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.570306 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5f549dc79b-r6txc" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api" containerID="cri-o://74abda16ddeb18724719233b41a606846e167c1663b5d505e84c22f45c88a538" gracePeriod=30 Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.570435 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.686239 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5f549dc79b-r6txc" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Dec 04 17:57:31 crc kubenswrapper[4948]: I1204 17:57:31.716548 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5f549dc79b-r6txc" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.203289 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.259783 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-p84jl"] Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.260076 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="dnsmasq-dns" containerID="cri-o://b55807d6a1c99dcbcfb7a907c49525bc4b6ce1f050dd6a8f6afa9fe3a1b59140" gracePeriod=10 Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.592558 4948 generic.go:334] "Generic (PLEG): container finished" podID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerID="6883ce38c46d69955cae29c9c107760859306e0bf661a6ddcea19f1e38718a22" exitCode=143 Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.592635 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f549dc79b-r6txc" event={"ID":"821bbb80-f2bb-4806-93d4-7c4e74a6c39e","Type":"ContainerDied","Data":"6883ce38c46d69955cae29c9c107760859306e0bf661a6ddcea19f1e38718a22"} Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.595665 4948 generic.go:334] "Generic (PLEG): container finished" podID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerID="b55807d6a1c99dcbcfb7a907c49525bc4b6ce1f050dd6a8f6afa9fe3a1b59140" exitCode=0 Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.596076 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" event={"ID":"9ffb4d75-52dc-4cf7-90a9-7577b5dea591","Type":"ContainerDied","Data":"b55807d6a1c99dcbcfb7a907c49525bc4b6ce1f050dd6a8f6afa9fe3a1b59140"} Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.728003 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.784609 4948 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/neutron-759ffd8674-fjwkq"] Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.784807 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-759ffd8674-fjwkq" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-api" containerID="cri-o://8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e" gracePeriod=30 Dec 04 17:57:32 crc kubenswrapper[4948]: I1204 17:57:32.808346 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-759ffd8674-fjwkq" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-httpd" containerID="cri-o://1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f" gracePeriod=30 Dec 04 17:57:33 crc kubenswrapper[4948]: I1204 17:57:33.606663 4948 generic.go:334] "Generic (PLEG): container finished" podID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerID="1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f" exitCode=0 Dec 04 17:57:33 crc kubenswrapper[4948]: I1204 17:57:33.606734 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-759ffd8674-fjwkq" event={"ID":"66ea68ed-1808-416c-b0e8-a2682d3d3b1f","Type":"ContainerDied","Data":"1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f"} Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.037540 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nccrm" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.131793 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-config-data\") pod \"2b81424a-68f9-40e6-bd32-a932a675578a\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.131876 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b81424a-68f9-40e6-bd32-a932a675578a-etc-machine-id\") pod \"2b81424a-68f9-40e6-bd32-a932a675578a\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.131941 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-scripts\") pod \"2b81424a-68f9-40e6-bd32-a932a675578a\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.132113 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkl2m\" (UniqueName: \"kubernetes.io/projected/2b81424a-68f9-40e6-bd32-a932a675578a-kube-api-access-kkl2m\") pod \"2b81424a-68f9-40e6-bd32-a932a675578a\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.132159 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-db-sync-config-data\") pod \"2b81424a-68f9-40e6-bd32-a932a675578a\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.132209 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-combined-ca-bundle\") pod \"2b81424a-68f9-40e6-bd32-a932a675578a\" (UID: \"2b81424a-68f9-40e6-bd32-a932a675578a\") " Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.133857 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b81424a-68f9-40e6-bd32-a932a675578a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2b81424a-68f9-40e6-bd32-a932a675578a" (UID: "2b81424a-68f9-40e6-bd32-a932a675578a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.144747 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-scripts" (OuterVolumeSpecName: "scripts") pod "2b81424a-68f9-40e6-bd32-a932a675578a" (UID: "2b81424a-68f9-40e6-bd32-a932a675578a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.144974 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2b81424a-68f9-40e6-bd32-a932a675578a" (UID: "2b81424a-68f9-40e6-bd32-a932a675578a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.151828 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b81424a-68f9-40e6-bd32-a932a675578a-kube-api-access-kkl2m" (OuterVolumeSpecName: "kube-api-access-kkl2m") pod "2b81424a-68f9-40e6-bd32-a932a675578a" (UID: "2b81424a-68f9-40e6-bd32-a932a675578a"). InnerVolumeSpecName "kube-api-access-kkl2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.189981 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b81424a-68f9-40e6-bd32-a932a675578a" (UID: "2b81424a-68f9-40e6-bd32-a932a675578a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.201558 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-config-data" (OuterVolumeSpecName: "config-data") pod "2b81424a-68f9-40e6-bd32-a932a675578a" (UID: "2b81424a-68f9-40e6-bd32-a932a675578a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.235512 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.235559 4948 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b81424a-68f9-40e6-bd32-a932a675578a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.235572 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.235585 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkl2m\" (UniqueName: \"kubernetes.io/projected/2b81424a-68f9-40e6-bd32-a932a675578a-kube-api-access-kkl2m\") on node \"crc\" DevicePath \"\"" Dec 04 
17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.235597 4948 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.235609 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b81424a-68f9-40e6-bd32-a932a675578a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.333777 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.333922 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.335379 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.633651 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nccrm" event={"ID":"2b81424a-68f9-40e6-bd32-a932a675578a","Type":"ContainerDied","Data":"2e57988cfcced286f7ab9347d4eb01560da850e84a77ada315a49b4e465b093b"} Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.633703 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e57988cfcced286f7ab9347d4eb01560da850e84a77ada315a49b4e465b093b" Dec 04 17:57:34 crc kubenswrapper[4948]: I1204 17:57:34.633681 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nccrm" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.391100 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:35 crc kubenswrapper[4948]: E1204 17:57:35.391858 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b81424a-68f9-40e6-bd32-a932a675578a" containerName="cinder-db-sync" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.391896 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b81424a-68f9-40e6-bd32-a932a675578a" containerName="cinder-db-sync" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.392130 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b81424a-68f9-40e6-bd32-a932a675578a" containerName="cinder-db-sync" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.393280 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.405317 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2q66q" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.405602 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.405719 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.405833 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.412214 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.458028 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-scripts\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.458125 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlx8s\" (UniqueName: \"kubernetes.io/projected/d4b458e9-976a-4599-8b47-9f9c368eff65-kube-api-access-hlx8s\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.458245 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4b458e9-976a-4599-8b47-9f9c368eff65-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.458283 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.458304 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.458340 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.503617 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-r8vh6"] Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.512575 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.537867 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-r8vh6"] Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566443 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566490 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h4dl\" (UniqueName: \"kubernetes.io/projected/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-kube-api-access-7h4dl\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566550 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 
17:57:35.566608 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566642 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-svc\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566836 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4b458e9-976a-4599-8b47-9f9c368eff65-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566932 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.566990 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.567132 4948 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.567248 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-scripts\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.567317 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlx8s\" (UniqueName: \"kubernetes.io/projected/d4b458e9-976a-4599-8b47-9f9c368eff65-kube-api-access-hlx8s\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.567377 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-config\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.567532 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4b458e9-976a-4599-8b47-9f9c368eff65-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.575153 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.575410 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.579762 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.585908 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.604965 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlx8s\" (UniqueName: \"kubernetes.io/projected/d4b458e9-976a-4599-8b47-9f9c368eff65-kube-api-access-hlx8s\") pod \"cinder-scheduler-0\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.668803 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-config\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.669176 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.669211 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h4dl\" (UniqueName: \"kubernetes.io/projected/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-kube-api-access-7h4dl\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.669238 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.669284 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.669306 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-svc\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.670510 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-svc\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.671122 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-config\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.671266 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.671734 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.672212 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.709018 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.711295 
4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.720021 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h4dl\" (UniqueName: \"kubernetes.io/projected/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-kube-api-access-7h4dl\") pod \"dnsmasq-dns-6578955fd5-r8vh6\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.720300 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.747937 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.774947 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9f536c80-2f19-4b52-bbce-9e6612afb36c-kube-api-access-dtmsn\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.774992 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.775075 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f536c80-2f19-4b52-bbce-9e6612afb36c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 
17:57:35.775096 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.775113 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.775218 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-scripts\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.775270 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f536c80-2f19-4b52-bbce-9e6612afb36c-logs\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.788486 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.877932 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-scripts\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.878022 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f536c80-2f19-4b52-bbce-9e6612afb36c-logs\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.878072 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9f536c80-2f19-4b52-bbce-9e6612afb36c-kube-api-access-dtmsn\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.878095 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.878142 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f536c80-2f19-4b52-bbce-9e6612afb36c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.878161 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.878176 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.881273 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f536c80-2f19-4b52-bbce-9e6612afb36c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.881844 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f536c80-2f19-4b52-bbce-9e6612afb36c-logs\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.885634 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-scripts\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.886489 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.891169 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.898521 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9f536c80-2f19-4b52-bbce-9e6612afb36c-kube-api-access-dtmsn\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.899281 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " pod="openstack/cinder-api-0" Dec 04 17:57:35 crc kubenswrapper[4948]: I1204 17:57:35.986755 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.063131 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.188226 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f549dc79b-r6txc" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:38764->10.217.0.157:9311: read: connection reset by peer" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.188242 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5f549dc79b-r6txc" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:38772->10.217.0.157:9311: read: connection reset by peer" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.424881 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.505572 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-swift-storage-0\") pod \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.505654 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-nb\") pod \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.505709 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-svc\") pod \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.508286 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-sb\") pod \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.508391 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-config\") pod \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.508621 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9ggw\" (UniqueName: \"kubernetes.io/projected/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-kube-api-access-v9ggw\") pod \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\" (UID: \"9ffb4d75-52dc-4cf7-90a9-7577b5dea591\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.516350 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-kube-api-access-v9ggw" (OuterVolumeSpecName: "kube-api-access-v9ggw") pod "9ffb4d75-52dc-4cf7-90a9-7577b5dea591" (UID: "9ffb4d75-52dc-4cf7-90a9-7577b5dea591"). InnerVolumeSpecName "kube-api-access-v9ggw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.590599 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ffb4d75-52dc-4cf7-90a9-7577b5dea591" (UID: "9ffb4d75-52dc-4cf7-90a9-7577b5dea591"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.590550 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ffb4d75-52dc-4cf7-90a9-7577b5dea591" (UID: "9ffb4d75-52dc-4cf7-90a9-7577b5dea591"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.600341 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-config" (OuterVolumeSpecName: "config") pod "9ffb4d75-52dc-4cf7-90a9-7577b5dea591" (UID: "9ffb4d75-52dc-4cf7-90a9-7577b5dea591"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.610472 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.610498 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9ggw\" (UniqueName: \"kubernetes.io/projected/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-kube-api-access-v9ggw\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.610510 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.610518 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.614739 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ffb4d75-52dc-4cf7-90a9-7577b5dea591" (UID: "9ffb4d75-52dc-4cf7-90a9-7577b5dea591"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.615232 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ffb4d75-52dc-4cf7-90a9-7577b5dea591" (UID: "9ffb4d75-52dc-4cf7-90a9-7577b5dea591"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.689587 4948 generic.go:334] "Generic (PLEG): container finished" podID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerID="74abda16ddeb18724719233b41a606846e167c1663b5d505e84c22f45c88a538" exitCode=0 Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.689664 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f549dc79b-r6txc" event={"ID":"821bbb80-f2bb-4806-93d4-7c4e74a6c39e","Type":"ContainerDied","Data":"74abda16ddeb18724719233b41a606846e167c1663b5d505e84c22f45c88a538"} Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.691822 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" event={"ID":"9ffb4d75-52dc-4cf7-90a9-7577b5dea591","Type":"ContainerDied","Data":"63870ffa76e8a629eeb58392a183657894ad80203bed60421a58a4c7a0e2e4b6"} Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.691871 4948 scope.go:117] "RemoveContainer" containerID="b55807d6a1c99dcbcfb7a907c49525bc4b6ce1f050dd6a8f6afa9fe3a1b59140" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.692025 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.712102 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.712327 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ffb4d75-52dc-4cf7-90a9-7577b5dea591-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.750678 4948 scope.go:117] "RemoveContainer" containerID="34d55ac9e6ea4ab06758c3e1d7eb7a3ceeb54e98b523e9ae77ca0a5cf205fa8a" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.775763 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-p84jl"] Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.786100 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-p84jl"] Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.843974 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.916025 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-combined-ca-bundle\") pod \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.916098 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfcw\" (UniqueName: \"kubernetes.io/projected/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-kube-api-access-vwfcw\") pod \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.916201 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-logs\") pod \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.916224 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data\") pod \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.916591 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data-custom\") pod \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\" (UID: \"821bbb80-f2bb-4806-93d4-7c4e74a6c39e\") " Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.917687 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-logs" (OuterVolumeSpecName: "logs") pod "821bbb80-f2bb-4806-93d4-7c4e74a6c39e" (UID: "821bbb80-f2bb-4806-93d4-7c4e74a6c39e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.921164 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "821bbb80-f2bb-4806-93d4-7c4e74a6c39e" (UID: "821bbb80-f2bb-4806-93d4-7c4e74a6c39e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.928158 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-kube-api-access-vwfcw" (OuterVolumeSpecName: "kube-api-access-vwfcw") pod "821bbb80-f2bb-4806-93d4-7c4e74a6c39e" (UID: "821bbb80-f2bb-4806-93d4-7c4e74a6c39e"). InnerVolumeSpecName "kube-api-access-vwfcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.950283 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "821bbb80-f2bb-4806-93d4-7c4e74a6c39e" (UID: "821bbb80-f2bb-4806-93d4-7c4e74a6c39e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.950724 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" path="/var/lib/kubelet/pods/9ffb4d75-52dc-4cf7-90a9-7577b5dea591/volumes" Dec 04 17:57:36 crc kubenswrapper[4948]: I1204 17:57:36.985383 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data" (OuterVolumeSpecName: "config-data") pod "821bbb80-f2bb-4806-93d4-7c4e74a6c39e" (UID: "821bbb80-f2bb-4806-93d4-7c4e74a6c39e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.018497 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.018532 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.018542 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwfcw\" (UniqueName: \"kubernetes.io/projected/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-kube-api-access-vwfcw\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.018552 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.018561 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/821bbb80-f2bb-4806-93d4-7c4e74a6c39e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.045349 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-r8vh6"] Dec 04 17:57:37 crc kubenswrapper[4948]: W1204 17:57:37.052838 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b989d7_73b9_4e45_b5fc_ddc77fa81e6e.slice/crio-c5603b44452cce4a343f9a17b128acc762e84138c0d01ea722208f869765ecc7 WatchSource:0}: Error finding container c5603b44452cce4a343f9a17b128acc762e84138c0d01ea722208f869765ecc7: Status 404 returned error can't find the container with id c5603b44452cce4a343f9a17b128acc762e84138c0d01ea722208f869765ecc7 Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.057592 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.277963 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:37 crc kubenswrapper[4948]: W1204 17:57:37.283732 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f536c80_2f19_4b52_bbce_9e6612afb36c.slice/crio-e4f62d46a5d20e3c57f79865ff479cc8c1fe19b99594b3f7b64467f2ce1fa7b2 WatchSource:0}: Error finding container e4f62d46a5d20e3c57f79865ff479cc8c1fe19b99594b3f7b64467f2ce1fa7b2: Status 404 returned error can't find the container with id e4f62d46a5d20e3c57f79865ff479cc8c1fe19b99594b3f7b64467f2ce1fa7b2 Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.701223 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f536c80-2f19-4b52-bbce-9e6612afb36c","Type":"ContainerStarted","Data":"e4f62d46a5d20e3c57f79865ff479cc8c1fe19b99594b3f7b64467f2ce1fa7b2"} Dec 04 17:57:37 crc kubenswrapper[4948]: 
I1204 17:57:37.704076 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerStarted","Data":"64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1"} Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.704150 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.704161 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-central-agent" containerID="cri-o://5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db" gracePeriod=30 Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.704192 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-notification-agent" containerID="cri-o://a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af" gracePeriod=30 Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.704216 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="proxy-httpd" containerID="cri-o://64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1" gracePeriod=30 Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.704323 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="sg-core" containerID="cri-o://b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef" gracePeriod=30 Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.707894 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"d4b458e9-976a-4599-8b47-9f9c368eff65","Type":"ContainerStarted","Data":"bb058617ba8e96267722ab0dc226ff5275d3f7610a8d32e0e13c1a7bd7e92c02"} Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.710430 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f549dc79b-r6txc" event={"ID":"821bbb80-f2bb-4806-93d4-7c4e74a6c39e","Type":"ContainerDied","Data":"fa666fdda0328f94228b8284b9c5853809a9c5c8d24615290e56a522a2d55a17"} Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.710476 4948 scope.go:117] "RemoveContainer" containerID="74abda16ddeb18724719233b41a606846e167c1663b5d505e84c22f45c88a538" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.710671 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f549dc79b-r6txc" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.720438 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" event={"ID":"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e","Type":"ContainerStarted","Data":"a420791b3826131acb476eb5c6cf776be12737be74e1462171b95730c8dcf98a"} Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.720496 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" event={"ID":"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e","Type":"ContainerStarted","Data":"c5603b44452cce4a343f9a17b128acc762e84138c0d01ea722208f869765ecc7"} Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.731373 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.428458525 podStartE2EDuration="1m4.731354845s" podCreationTimestamp="2025-12-04 17:56:33 +0000 UTC" firstStartedPulling="2025-12-04 17:56:35.16018637 +0000 UTC m=+1806.521260772" lastFinishedPulling="2025-12-04 17:57:36.46308269 +0000 UTC m=+1867.824157092" observedRunningTime="2025-12-04 17:57:37.72542471 +0000 UTC m=+1869.086499112" 
watchObservedRunningTime="2025-12-04 17:57:37.731354845 +0000 UTC m=+1869.092429247" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.749664 4948 scope.go:117] "RemoveContainer" containerID="6883ce38c46d69955cae29c9c107760859306e0bf661a6ddcea19f1e38718a22" Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.758587 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5f549dc79b-r6txc"] Dec 04 17:57:37 crc kubenswrapper[4948]: I1204 17:57:37.766859 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5f549dc79b-r6txc"] Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.120341 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.738875 4948 generic.go:334] "Generic (PLEG): container finished" podID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerID="64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1" exitCode=0 Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.739148 4948 generic.go:334] "Generic (PLEG): container finished" podID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerID="b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef" exitCode=2 Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.739159 4948 generic.go:334] "Generic (PLEG): container finished" podID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerID="5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db" exitCode=0 Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.738958 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerDied","Data":"64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1"} Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.739251 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerDied","Data":"b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef"} Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.739278 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerDied","Data":"5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db"} Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.742504 4948 generic.go:334] "Generic (PLEG): container finished" podID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerID="a420791b3826131acb476eb5c6cf776be12737be74e1462171b95730c8dcf98a" exitCode=0 Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.742541 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" event={"ID":"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e","Type":"ContainerDied","Data":"a420791b3826131acb476eb5c6cf776be12737be74e1462171b95730c8dcf98a"} Dec 04 17:57:38 crc kubenswrapper[4948]: I1204 17:57:38.927746 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" path="/var/lib/kubelet/pods/821bbb80-f2bb-4806-93d4-7c4e74a6c39e/volumes" Dec 04 17:57:39 crc kubenswrapper[4948]: I1204 17:57:39.758652 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" event={"ID":"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e","Type":"ContainerStarted","Data":"f2f5428c8fb22126200466715bb9c4edf354bd2a71a3ce55d28b33cc742b21d9"} Dec 04 17:57:39 crc kubenswrapper[4948]: I1204 17:57:39.759081 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:39 crc kubenswrapper[4948]: I1204 17:57:39.762155 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"9f536c80-2f19-4b52-bbce-9e6612afb36c","Type":"ContainerStarted","Data":"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488"} Dec 04 17:57:39 crc kubenswrapper[4948]: I1204 17:57:39.787603 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" podStartSLOduration=4.787583904 podStartE2EDuration="4.787583904s" podCreationTimestamp="2025-12-04 17:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:39.780825625 +0000 UTC m=+1871.141900027" watchObservedRunningTime="2025-12-04 17:57:39.787583904 +0000 UTC m=+1871.148658306" Dec 04 17:57:39 crc kubenswrapper[4948]: I1204 17:57:39.913583 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:57:39 crc kubenswrapper[4948]: E1204 17:57:39.914122 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.354824 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.384479 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-config\") pod \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.384582 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs\") pod \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.384603 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhwrk\" (UniqueName: \"kubernetes.io/projected/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-kube-api-access-vhwrk\") pod \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.384642 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-combined-ca-bundle\") pod \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.384665 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-httpd-config\") pod \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.405281 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-kube-api-access-vhwrk" (OuterVolumeSpecName: "kube-api-access-vhwrk") pod "66ea68ed-1808-416c-b0e8-a2682d3d3b1f" (UID: "66ea68ed-1808-416c-b0e8-a2682d3d3b1f"). InnerVolumeSpecName "kube-api-access-vhwrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.406431 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "66ea68ed-1808-416c-b0e8-a2682d3d3b1f" (UID: "66ea68ed-1808-416c-b0e8-a2682d3d3b1f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.465495 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66ea68ed-1808-416c-b0e8-a2682d3d3b1f" (UID: "66ea68ed-1808-416c-b0e8-a2682d3d3b1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.467285 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-config" (OuterVolumeSpecName: "config") pod "66ea68ed-1808-416c-b0e8-a2682d3d3b1f" (UID: "66ea68ed-1808-416c-b0e8-a2682d3d3b1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.485338 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "66ea68ed-1808-416c-b0e8-a2682d3d3b1f" (UID: "66ea68ed-1808-416c-b0e8-a2682d3d3b1f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.485717 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs\") pod \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\" (UID: \"66ea68ed-1808-416c-b0e8-a2682d3d3b1f\") " Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.486103 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.486129 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhwrk\" (UniqueName: \"kubernetes.io/projected/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-kube-api-access-vhwrk\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.486144 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.486156 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:40 crc kubenswrapper[4948]: W1204 17:57:40.486252 4948 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/66ea68ed-1808-416c-b0e8-a2682d3d3b1f/volumes/kubernetes.io~secret/ovndb-tls-certs Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.486263 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod 
"66ea68ed-1808-416c-b0e8-a2682d3d3b1f" (UID: "66ea68ed-1808-416c-b0e8-a2682d3d3b1f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.587704 4948 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/66ea68ed-1808-416c-b0e8-a2682d3d3b1f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.773697 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d4b458e9-976a-4599-8b47-9f9c368eff65","Type":"ContainerStarted","Data":"cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551"} Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.776202 4948 generic.go:334] "Generic (PLEG): container finished" podID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerID="8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e" exitCode=0 Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.776263 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-759ffd8674-fjwkq" event={"ID":"66ea68ed-1808-416c-b0e8-a2682d3d3b1f","Type":"ContainerDied","Data":"8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e"} Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.776273 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-759ffd8674-fjwkq" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.776286 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-759ffd8674-fjwkq" event={"ID":"66ea68ed-1808-416c-b0e8-a2682d3d3b1f","Type":"ContainerDied","Data":"ac45923e9c7627031a4f5ad1d74f087ca83e61cb8fccd12c17c502a87d1b1fe4"} Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.776308 4948 scope.go:117] "RemoveContainer" containerID="1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.780790 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f536c80-2f19-4b52-bbce-9e6612afb36c","Type":"ContainerStarted","Data":"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751"} Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.780874 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api-log" containerID="cri-o://db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488" gracePeriod=30 Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.780949 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api" containerID="cri-o://1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751" gracePeriod=30 Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.781164 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.813977 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.813955558 podStartE2EDuration="5.813955558s" podCreationTimestamp="2025-12-04 17:57:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:40.804887161 +0000 UTC m=+1872.165961563" watchObservedRunningTime="2025-12-04 17:57:40.813955558 +0000 UTC m=+1872.175029960" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.824423 4948 scope.go:117] "RemoveContainer" containerID="8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.831105 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-759ffd8674-fjwkq"] Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.840836 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-759ffd8674-fjwkq"] Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.844655 4948 scope.go:117] "RemoveContainer" containerID="1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f" Dec 04 17:57:40 crc kubenswrapper[4948]: E1204 17:57:40.845095 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f\": container with ID starting with 1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f not found: ID does not exist" containerID="1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.845136 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f"} err="failed to get container status \"1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f\": rpc error: code = NotFound desc = could not find container \"1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f\": container with ID starting with 1be6f78ec5d04e74dd92c2cec1734b44c15b1af7238f3554569b98f46ac8f21f not found: ID does not 
exist" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.845164 4948 scope.go:117] "RemoveContainer" containerID="8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e" Dec 04 17:57:40 crc kubenswrapper[4948]: E1204 17:57:40.845637 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e\": container with ID starting with 8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e not found: ID does not exist" containerID="8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.845682 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e"} err="failed to get container status \"8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e\": rpc error: code = NotFound desc = could not find container \"8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e\": container with ID starting with 8c08e4556639bcce85e8bbc0d269cf96c14c6433c491e161cd09ed0b9164228e not found: ID does not exist" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.914447 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c79d794d7-p84jl" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Dec 04 17:57:40 crc kubenswrapper[4948]: I1204 17:57:40.923636 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" path="/var/lib/kubelet/pods/66ea68ed-1808-416c-b0e8-a2682d3d3b1f/volumes" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.256963 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408084 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data-custom\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408211 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-scripts\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408264 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f536c80-2f19-4b52-bbce-9e6612afb36c-etc-machine-id\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408533 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-combined-ca-bundle\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408671 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9f536c80-2f19-4b52-bbce-9e6612afb36c-kube-api-access-dtmsn\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408712 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/9f536c80-2f19-4b52-bbce-9e6612afb36c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408735 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f536c80-2f19-4b52-bbce-9e6612afb36c-logs\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.408825 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data\") pod \"9f536c80-2f19-4b52-bbce-9e6612afb36c\" (UID: \"9f536c80-2f19-4b52-bbce-9e6612afb36c\") " Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.409630 4948 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f536c80-2f19-4b52-bbce-9e6612afb36c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.409613 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f536c80-2f19-4b52-bbce-9e6612afb36c-logs" (OuterVolumeSpecName: "logs") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.414683 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.414726 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-scripts" (OuterVolumeSpecName: "scripts") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.415532 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f536c80-2f19-4b52-bbce-9e6612afb36c-kube-api-access-dtmsn" (OuterVolumeSpecName: "kube-api-access-dtmsn") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "kube-api-access-dtmsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.451517 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.466103 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data" (OuterVolumeSpecName: "config-data") pod "9f536c80-2f19-4b52-bbce-9e6612afb36c" (UID: "9f536c80-2f19-4b52-bbce-9e6612afb36c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.511273 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.511311 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9f536c80-2f19-4b52-bbce-9e6612afb36c-kube-api-access-dtmsn\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.511329 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f536c80-2f19-4b52-bbce-9e6612afb36c-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.511346 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.511364 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.511380 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9f536c80-2f19-4b52-bbce-9e6612afb36c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799209 4948 generic.go:334] "Generic (PLEG): container finished" podID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerID="1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751" exitCode=0 Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799252 4948 generic.go:334] "Generic (PLEG): container finished" podID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerID="db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488" exitCode=143 Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799276 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799311 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f536c80-2f19-4b52-bbce-9e6612afb36c","Type":"ContainerDied","Data":"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751"} Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799354 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f536c80-2f19-4b52-bbce-9e6612afb36c","Type":"ContainerDied","Data":"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488"} Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799373 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f536c80-2f19-4b52-bbce-9e6612afb36c","Type":"ContainerDied","Data":"e4f62d46a5d20e3c57f79865ff479cc8c1fe19b99594b3f7b64467f2ce1fa7b2"} Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.799394 4948 scope.go:117] "RemoveContainer" containerID="1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.803536 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"d4b458e9-976a-4599-8b47-9f9c368eff65","Type":"ContainerStarted","Data":"dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995"} Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.830348 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.884606894 podStartE2EDuration="6.830328597s" podCreationTimestamp="2025-12-04 17:57:35 +0000 UTC" firstStartedPulling="2025-12-04 17:57:37.064418402 +0000 UTC m=+1868.425492804" lastFinishedPulling="2025-12-04 17:57:40.010140105 +0000 UTC m=+1871.371214507" observedRunningTime="2025-12-04 17:57:41.828966017 +0000 UTC m=+1873.190040499" watchObservedRunningTime="2025-12-04 17:57:41.830328597 +0000 UTC m=+1873.191402999" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.842931 4948 scope.go:117] "RemoveContainer" containerID="db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.859659 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.872650 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.883788 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884275 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api-log" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884290 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api-log" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884316 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" 
containerName="neutron-api" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884325 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-api" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884335 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="init" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884343 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="init" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884360 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-httpd" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884369 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-httpd" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884390 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="dnsmasq-dns" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884398 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="dnsmasq-dns" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884419 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api-log" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884427 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api-log" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884440 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api" Dec 04 17:57:41 crc 
kubenswrapper[4948]: I1204 17:57:41.884449 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.884472 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884479 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884722 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-api" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884744 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffb4d75-52dc-4cf7-90a9-7577b5dea591" containerName="dnsmasq-dns" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884755 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api-log" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884793 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" containerName="cinder-api" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884815 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ea68ed-1808-416c-b0e8-a2682d3d3b1f" containerName="neutron-httpd" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884829 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.884847 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="821bbb80-f2bb-4806-93d4-7c4e74a6c39e" containerName="barbican-api-log" Dec 04 17:57:41 crc kubenswrapper[4948]: 
I1204 17:57:41.886074 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.889716 4948 scope.go:117] "RemoveContainer" containerID="1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751" Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.890926 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751\": container with ID starting with 1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751 not found: ID does not exist" containerID="1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.890979 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751"} err="failed to get container status \"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751\": rpc error: code = NotFound desc = could not find container \"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751\": container with ID starting with 1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751 not found: ID does not exist" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.891010 4948 scope.go:117] "RemoveContainer" containerID="db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.891599 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.891721 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.899181 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-api-0"] Dec 04 17:57:41 crc kubenswrapper[4948]: E1204 17:57:41.899927 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488\": container with ID starting with db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488 not found: ID does not exist" containerID="db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.899965 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488"} err="failed to get container status \"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488\": rpc error: code = NotFound desc = could not find container \"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488\": container with ID starting with db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488 not found: ID does not exist" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.899994 4948 scope.go:117] "RemoveContainer" containerID="1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.901065 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751"} err="failed to get container status \"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751\": rpc error: code = NotFound desc = could not find container \"1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751\": container with ID starting with 1dc1875da2948f4712493fed7423e8e8d336a1b1958a1d826e9ea26fc9c82751 not found: ID does not exist" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.901119 4948 scope.go:117] "RemoveContainer" 
containerID="db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.901680 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488"} err="failed to get container status \"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488\": rpc error: code = NotFound desc = could not find container \"db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488\": container with ID starting with db24a3740f5c9e125032bb393e78f232670ca784dac07cdd79d4ce394c16e488 not found: ID does not exist" Dec 04 17:57:41 crc kubenswrapper[4948]: I1204 17:57:41.901942 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025099 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025409 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025432 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc 
kubenswrapper[4948]: I1204 17:57:42.025479 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pwcc\" (UniqueName: \"kubernetes.io/projected/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-kube-api-access-8pwcc\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025520 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025545 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-logs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025576 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-scripts\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025625 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.025702 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data-custom\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.127748 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.127832 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.127865 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.127919 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pwcc\" (UniqueName: \"kubernetes.io/projected/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-kube-api-access-8pwcc\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.127965 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " 
pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.128059 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-logs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.128100 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-scripts\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.128142 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.128195 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data-custom\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.129550 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.130892 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-logs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.132606 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.132730 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.133859 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-scripts\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.135426 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data-custom\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.139410 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.148023 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.152322 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pwcc\" (UniqueName: \"kubernetes.io/projected/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-kube-api-access-8pwcc\") pod \"cinder-api-0\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.218667 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.672964 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 17:57:42 crc kubenswrapper[4948]: W1204 17:57:42.681412 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfdde2fd_5c98_4b6f_b9a5_a746a454fafd.slice/crio-fb6b9e12c3fdd67fa4143d373ae90ebd6b9aafd6949f02e8e590d78df808ea76 WatchSource:0}: Error finding container fb6b9e12c3fdd67fa4143d373ae90ebd6b9aafd6949f02e8e590d78df808ea76: Status 404 returned error can't find the container with id fb6b9e12c3fdd67fa4143d373ae90ebd6b9aafd6949f02e8e590d78df808ea76 Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.815683 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd","Type":"ContainerStarted","Data":"fb6b9e12c3fdd67fa4143d373ae90ebd6b9aafd6949f02e8e590d78df808ea76"} Dec 04 17:57:42 crc kubenswrapper[4948]: I1204 17:57:42.928071 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f536c80-2f19-4b52-bbce-9e6612afb36c" 
path="/var/lib/kubelet/pods/9f536c80-2f19-4b52-bbce-9e6612afb36c/volumes" Dec 04 17:57:43 crc kubenswrapper[4948]: I1204 17:57:43.843257 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd","Type":"ContainerStarted","Data":"9bcf71c31ca1e73a965969fb92ddba24201c5c11e6c3cb3b4e92e77ecdd4bf87"} Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.738273 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.854294 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd","Type":"ContainerStarted","Data":"637497b65838d4e1875162878d30bf8895cfbdd36b9fd9f4596de491cb8f3761"} Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.854796 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.858743 4948 generic.go:334] "Generic (PLEG): container finished" podID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerID="a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af" exitCode=0 Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.858781 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerDied","Data":"a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af"} Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.858804 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d845ad24-e30a-41e2-8a0b-6812b49b91d1","Type":"ContainerDied","Data":"2d5fbce719e33e8cfe366c025168b94386f18db8b8fcc08f9fdf893fee26302f"} Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.858821 4948 scope.go:117] "RemoveContainer" 
containerID="64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.858843 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.878669 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-log-httpd\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.878749 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4zb8\" (UniqueName: \"kubernetes.io/projected/d845ad24-e30a-41e2-8a0b-6812b49b91d1-kube-api-access-z4zb8\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.878879 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-config-data\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.878915 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-run-httpd\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.879011 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-sg-core-conf-yaml\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: 
\"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.879108 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-combined-ca-bundle\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.879168 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-scripts\") pod \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\" (UID: \"d845ad24-e30a-41e2-8a0b-6812b49b91d1\") " Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.879567 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.879927 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.882377 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.882671 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.88264867 podStartE2EDuration="3.88264867s" podCreationTimestamp="2025-12-04 17:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:44.873709257 +0000 UTC m=+1876.234783679" watchObservedRunningTime="2025-12-04 17:57:44.88264867 +0000 UTC m=+1876.243723082" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.884704 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-scripts" (OuterVolumeSpecName: "scripts") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.890265 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d845ad24-e30a-41e2-8a0b-6812b49b91d1-kube-api-access-z4zb8" (OuterVolumeSpecName: "kube-api-access-z4zb8") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "kube-api-access-z4zb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.921808 4948 scope.go:117] "RemoveContainer" containerID="b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.937739 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.973294 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.981212 4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.981244 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.981253 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.981266 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4zb8\" (UniqueName: \"kubernetes.io/projected/d845ad24-e30a-41e2-8a0b-6812b49b91d1-kube-api-access-z4zb8\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.981276 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d845ad24-e30a-41e2-8a0b-6812b49b91d1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.989162 4948 scope.go:117] "RemoveContainer" containerID="a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af" Dec 04 17:57:44 crc kubenswrapper[4948]: I1204 17:57:44.997136 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-config-data" (OuterVolumeSpecName: "config-data") pod "d845ad24-e30a-41e2-8a0b-6812b49b91d1" (UID: "d845ad24-e30a-41e2-8a0b-6812b49b91d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.010957 4948 scope.go:117] "RemoveContainer" containerID="5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.031493 4948 scope.go:117] "RemoveContainer" containerID="64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.032092 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1\": container with ID starting with 64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1 not found: ID does not exist" containerID="64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.032140 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1"} err="failed to get container status \"64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1\": rpc error: code = NotFound desc = could not find container \"64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1\": container with ID starting with 64a980ad261ffff4946611e2678645afb81edabcfe18b81e044e44264d40efb1 not found: ID does not exist" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.032174 4948 scope.go:117] "RemoveContainer" containerID="b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.032484 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef\": container with ID starting with 
b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef not found: ID does not exist" containerID="b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.032516 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef"} err="failed to get container status \"b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef\": rpc error: code = NotFound desc = could not find container \"b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef\": container with ID starting with b846a1e8f62efef0c30a88a1992a553a37f312c2e771ead3ea5badfaee287aef not found: ID does not exist" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.032542 4948 scope.go:117] "RemoveContainer" containerID="a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.032867 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af\": container with ID starting with a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af not found: ID does not exist" containerID="a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.032891 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af"} err="failed to get container status \"a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af\": rpc error: code = NotFound desc = could not find container \"a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af\": container with ID starting with a4d791c5c2c07f0db5960c12f1cebbb6476ddd3ece51fe3e3eb50ad8807672af not found: ID does not 
exist" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.032909 4948 scope.go:117] "RemoveContainer" containerID="5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.033269 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db\": container with ID starting with 5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db not found: ID does not exist" containerID="5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.033295 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db"} err="failed to get container status \"5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db\": rpc error: code = NotFound desc = could not find container \"5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db\": container with ID starting with 5d3d4e1fe5e3c163765aafcb89e6d4ca0c56b743fc099836a18a6728448bb2db not found: ID does not exist" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.086999 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d845ad24-e30a-41e2-8a0b-6812b49b91d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.233267 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.261214 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.270397 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 
17:57:45.270806 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="sg-core" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.270827 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="sg-core" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.270864 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-notification-agent" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.270871 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-notification-agent" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.270882 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-central-agent" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.270888 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-central-agent" Dec 04 17:57:45 crc kubenswrapper[4948]: E1204 17:57:45.270904 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="proxy-httpd" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.270909 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="proxy-httpd" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.271075 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="sg-core" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.271092 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="proxy-httpd" Dec 04 17:57:45 crc kubenswrapper[4948]: 
I1204 17:57:45.271107 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-notification-agent" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.271119 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" containerName="ceilometer-central-agent" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.273289 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.276632 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.280354 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.280697 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.392318 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-run-httpd\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.392370 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-config-data\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.392449 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.392483 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-log-httpd\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.392644 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rz7l\" (UniqueName: \"kubernetes.io/projected/f1642d1b-1757-4936-973c-52bbc11672ea-kube-api-access-4rz7l\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.392891 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-scripts\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.393219 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495276 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rz7l\" (UniqueName: \"kubernetes.io/projected/f1642d1b-1757-4936-973c-52bbc11672ea-kube-api-access-4rz7l\") pod \"ceilometer-0\" (UID: 
\"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495429 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-scripts\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495588 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495665 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-run-httpd\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495707 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-config-data\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495773 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.495847 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-log-httpd\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.496389 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-log-httpd\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.496633 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-run-httpd\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.499793 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.500166 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-scripts\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.500962 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-config-data\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.505878 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.511946 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rz7l\" (UniqueName: \"kubernetes.io/projected/f1642d1b-1757-4936-973c-52bbc11672ea-kube-api-access-4rz7l\") pod \"ceilometer-0\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.591333 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.789954 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.989290 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:57:45 crc kubenswrapper[4948]: I1204 17:57:45.990653 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.113092 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.126981 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-zv76w"] Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.127294 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerName="dnsmasq-dns" containerID="cri-o://a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4" gracePeriod=10 Dec 
04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.162000 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.639037 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.826870 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7g4z\" (UniqueName: \"kubernetes.io/projected/7fa9312c-3146-4a5e-9db6-acc251aa60c6-kube-api-access-x7g4z\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.827028 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-sb\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.827232 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-swift-storage-0\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.827499 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-svc\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.828478 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.828587 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.835085 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa9312c-3146-4a5e-9db6-acc251aa60c6-kube-api-access-x7g4z" (OuterVolumeSpecName: "kube-api-access-x7g4z") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6"). InnerVolumeSpecName "kube-api-access-x7g4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.886650 4948 generic.go:334] "Generic (PLEG): container finished" podID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerID="a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4" exitCode=0 Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.886721 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" event={"ID":"7fa9312c-3146-4a5e-9db6-acc251aa60c6","Type":"ContainerDied","Data":"a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4"} Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.886753 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" event={"ID":"7fa9312c-3146-4a5e-9db6-acc251aa60c6","Type":"ContainerDied","Data":"0c417f06ae01252942678a7625b97553ed5bcbea9c3576045aee870c28c81cc4"} Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.886774 4948 scope.go:117] "RemoveContainer" 
containerID="a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.886899 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-zv76w" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.891493 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="cinder-scheduler" containerID="cri-o://cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551" gracePeriod=30 Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.891713 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerStarted","Data":"a0a9f479c1e16f7884e97cd89718d38bf6ce03f824db91381c662a9fb8c57a5b"} Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.891806 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="probe" containerID="cri-o://dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995" gracePeriod=30 Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.892689 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.912388 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:46 crc kubenswrapper[4948]: E1204 17:57:46.919541 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config podName:7fa9312c-3146-4a5e-9db6-acc251aa60c6 nodeName:}" failed. No retries permitted until 2025-12-04 17:57:47.419512239 +0000 UTC m=+1878.780586641 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6") : error deleting /var/lib/kubelet/pods/7fa9312c-3146-4a5e-9db6-acc251aa60c6/volume-subpaths: remove /var/lib/kubelet/pods/7fa9312c-3146-4a5e-9db6-acc251aa60c6/volume-subpaths: no such file or directory Dec 04 17:57:46 crc kubenswrapper[4948]: E1204 17:57:46.919676 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb podName:7fa9312c-3146-4a5e-9db6-acc251aa60c6 nodeName:}" failed. No retries permitted until 2025-12-04 17:57:47.41957357 +0000 UTC m=+1878.780647972 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6") : error deleting /var/lib/kubelet/pods/7fa9312c-3146-4a5e-9db6-acc251aa60c6/volume-subpaths: remove /var/lib/kubelet/pods/7fa9312c-3146-4a5e-9db6-acc251aa60c6/volume-subpaths: no such file or directory Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.919800 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.926337 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d845ad24-e30a-41e2-8a0b-6812b49b91d1" path="/var/lib/kubelet/pods/d845ad24-e30a-41e2-8a0b-6812b49b91d1/volumes" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.931913 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.931947 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7g4z\" (UniqueName: \"kubernetes.io/projected/7fa9312c-3146-4a5e-9db6-acc251aa60c6-kube-api-access-x7g4z\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.931959 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 
17:57:46.931973 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:46 crc kubenswrapper[4948]: I1204 17:57:46.972890 4948 scope.go:117] "RemoveContainer" containerID="0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:46.999975 4948 scope.go:117] "RemoveContainer" containerID="a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4" Dec 04 17:57:47 crc kubenswrapper[4948]: E1204 17:57:47.000563 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4\": container with ID starting with a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4 not found: ID does not exist" containerID="a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.000608 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4"} err="failed to get container status \"a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4\": rpc error: code = NotFound desc = could not find container \"a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4\": container with ID starting with a85538fe4d45db8924b9e542f8495d43bc042782caf800de95bd65d8919471e4 not found: ID does not exist" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.000633 4948 scope.go:117] "RemoveContainer" containerID="0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e" Dec 04 17:57:47 crc kubenswrapper[4948]: E1204 17:57:47.000932 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e\": container with ID starting with 0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e not found: ID does not exist" containerID="0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.000947 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e"} err="failed to get container status \"0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e\": rpc error: code = NotFound desc = could not find container \"0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e\": container with ID starting with 0214368fefd04c4c658e33d49677f1713afa30d22319ea44048e8eef31bdcf2e not found: ID does not exist" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.440837 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.441311 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config\") pod \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\" (UID: \"7fa9312c-3146-4a5e-9db6-acc251aa60c6\") " Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.442945 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config" (OuterVolumeSpecName: "config") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.444973 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fa9312c-3146-4a5e-9db6-acc251aa60c6" (UID: "7fa9312c-3146-4a5e-9db6-acc251aa60c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.543661 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.543693 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa9312c-3146-4a5e-9db6-acc251aa60c6-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.625402 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-zv76w"] Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.632636 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-zv76w"] Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.900813 4948 generic.go:334] "Generic (PLEG): container finished" podID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerID="dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995" exitCode=0 Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.901105 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d4b458e9-976a-4599-8b47-9f9c368eff65","Type":"ContainerDied","Data":"dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995"} Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.902710 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerStarted","Data":"ca448219170cf74e4c666ed20421a892937f95be03019f5a005dac72be5e7661"} Dec 04 17:57:47 crc kubenswrapper[4948]: I1204 17:57:47.902733 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerStarted","Data":"ac53562bbf54166812bd7a0b73efda3bf8f6f97111f3ec2e67b80d73e34962d8"} Dec 04 17:57:48 crc kubenswrapper[4948]: I1204 17:57:48.934463 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" path="/var/lib/kubelet/pods/7fa9312c-3146-4a5e-9db6-acc251aa60c6/volumes" Dec 04 17:57:48 crc kubenswrapper[4948]: I1204 17:57:48.935395 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerStarted","Data":"9a60739ff31490e058856809f63cd9767cc1b7d3ad856223e87883204b670f63"} Dec 04 17:57:50 crc kubenswrapper[4948]: I1204 17:57:50.914134 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 17:57:50 crc kubenswrapper[4948]: I1204 17:57:50.934462 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerStarted","Data":"f48600b73efa978b634d7b269e3abe48151d3ea30ac6e6b8f5e2b6f49bc2196f"} Dec 04 17:57:50 crc kubenswrapper[4948]: I1204 17:57:50.935171 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:57:50 crc kubenswrapper[4948]: I1204 17:57:50.959569 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.782769231 podStartE2EDuration="5.959547874s" podCreationTimestamp="2025-12-04 17:57:45 +0000 UTC" 
firstStartedPulling="2025-12-04 17:57:46.13417913 +0000 UTC m=+1877.495253532" lastFinishedPulling="2025-12-04 17:57:50.310957773 +0000 UTC m=+1881.672032175" observedRunningTime="2025-12-04 17:57:50.956497579 +0000 UTC m=+1882.317571991" watchObservedRunningTime="2025-12-04 17:57:50.959547874 +0000 UTC m=+1882.320622276" Dec 04 17:57:50 crc kubenswrapper[4948]: I1204 17:57:50.994523 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-54df7858f8-fz456" Dec 04 17:57:51 crc kubenswrapper[4948]: I1204 17:57:51.945924 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"2179a66ea554870aee48aa7049abeb21ba84072bb2764a52d1cc7c10e4f11e50"} Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.819777 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.934995 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-combined-ca-bundle\") pod \"d4b458e9-976a-4599-8b47-9f9c368eff65\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935028 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data-custom\") pod \"d4b458e9-976a-4599-8b47-9f9c368eff65\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935085 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data\") pod 
\"d4b458e9-976a-4599-8b47-9f9c368eff65\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935129 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-scripts\") pod \"d4b458e9-976a-4599-8b47-9f9c368eff65\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935173 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4b458e9-976a-4599-8b47-9f9c368eff65-etc-machine-id\") pod \"d4b458e9-976a-4599-8b47-9f9c368eff65\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935260 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlx8s\" (UniqueName: \"kubernetes.io/projected/d4b458e9-976a-4599-8b47-9f9c368eff65-kube-api-access-hlx8s\") pod \"d4b458e9-976a-4599-8b47-9f9c368eff65\" (UID: \"d4b458e9-976a-4599-8b47-9f9c368eff65\") " Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935338 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4b458e9-976a-4599-8b47-9f9c368eff65-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d4b458e9-976a-4599-8b47-9f9c368eff65" (UID: "d4b458e9-976a-4599-8b47-9f9c368eff65"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.935576 4948 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4b458e9-976a-4599-8b47-9f9c368eff65-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.940924 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4b458e9-976a-4599-8b47-9f9c368eff65" (UID: "d4b458e9-976a-4599-8b47-9f9c368eff65"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.942195 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4b458e9-976a-4599-8b47-9f9c368eff65-kube-api-access-hlx8s" (OuterVolumeSpecName: "kube-api-access-hlx8s") pod "d4b458e9-976a-4599-8b47-9f9c368eff65" (UID: "d4b458e9-976a-4599-8b47-9f9c368eff65"). InnerVolumeSpecName "kube-api-access-hlx8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.945136 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-scripts" (OuterVolumeSpecName: "scripts") pod "d4b458e9-976a-4599-8b47-9f9c368eff65" (UID: "d4b458e9-976a-4599-8b47-9f9c368eff65"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.954463 4948 generic.go:334] "Generic (PLEG): container finished" podID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerID="cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551" exitCode=0 Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.954503 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d4b458e9-976a-4599-8b47-9f9c368eff65","Type":"ContainerDied","Data":"cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551"} Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.954530 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d4b458e9-976a-4599-8b47-9f9c368eff65","Type":"ContainerDied","Data":"bb058617ba8e96267722ab0dc226ff5275d3f7610a8d32e0e13c1a7bd7e92c02"} Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.954546 4948 scope.go:117] "RemoveContainer" containerID="dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995" Dec 04 17:57:52 crc kubenswrapper[4948]: I1204 17:57:52.954655 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.028150 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4b458e9-976a-4599-8b47-9f9c368eff65" (UID: "d4b458e9-976a-4599-8b47-9f9c368eff65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.036782 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.036807 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlx8s\" (UniqueName: \"kubernetes.io/projected/d4b458e9-976a-4599-8b47-9f9c368eff65-kube-api-access-hlx8s\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.036818 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.036828 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.072160 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data" (OuterVolumeSpecName: "config-data") pod "d4b458e9-976a-4599-8b47-9f9c368eff65" (UID: "d4b458e9-976a-4599-8b47-9f9c368eff65"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.138892 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4b458e9-976a-4599-8b47-9f9c368eff65-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.155665 4948 scope.go:117] "RemoveContainer" containerID="cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.176619 4948 scope.go:117] "RemoveContainer" containerID="dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995" Dec 04 17:57:53 crc kubenswrapper[4948]: E1204 17:57:53.177122 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995\": container with ID starting with dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995 not found: ID does not exist" containerID="dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.177156 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995"} err="failed to get container status \"dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995\": rpc error: code = NotFound desc = could not find container \"dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995\": container with ID starting with dc5aa285c12e2280c5061047fbcfc87f38139a7fde69f190617794fe1b78c995 not found: ID does not exist" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.177179 4948 scope.go:117] "RemoveContainer" containerID="cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551" Dec 04 17:57:53 crc kubenswrapper[4948]: E1204 17:57:53.177499 4948 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551\": container with ID starting with cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551 not found: ID does not exist" containerID="cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.177518 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551"} err="failed to get container status \"cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551\": rpc error: code = NotFound desc = could not find container \"cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551\": container with ID starting with cc06d92ce9aee39770cbd663e897b0a050ae5c41c657eff2ae10f501004bf551 not found: ID does not exist" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.296809 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.318238 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339010 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:53 crc kubenswrapper[4948]: E1204 17:57:53.339571 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerName="dnsmasq-dns" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339588 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerName="dnsmasq-dns" Dec 04 17:57:53 crc kubenswrapper[4948]: E1204 17:57:53.339627 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" 
containerName="init" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339635 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerName="init" Dec 04 17:57:53 crc kubenswrapper[4948]: E1204 17:57:53.339652 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="cinder-scheduler" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339659 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="cinder-scheduler" Dec 04 17:57:53 crc kubenswrapper[4948]: E1204 17:57:53.339672 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="probe" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339680 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="probe" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339891 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa9312c-3146-4a5e-9db6-acc251aa60c6" containerName="dnsmasq-dns" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339902 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="probe" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.339911 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" containerName="cinder-scheduler" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.342170 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.349487 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.351992 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.444275 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdac4fb3-a888-4781-b1e0-99630c84fe0f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.444540 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.444690 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.444839 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz2cw\" (UniqueName: \"kubernetes.io/projected/cdac4fb3-a888-4781-b1e0-99630c84fe0f-kube-api-access-jz2cw\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 
17:57:53.444963 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.445096 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-scripts\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.546951 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.548168 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.548310 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz2cw\" (UniqueName: \"kubernetes.io/projected/cdac4fb3-a888-4781-b1e0-99630c84fe0f-kube-api-access-jz2cw\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.548464 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.548603 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-scripts\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.548784 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdac4fb3-a888-4781-b1e0-99630c84fe0f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.548979 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdac4fb3-a888-4781-b1e0-99630c84fe0f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.553407 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.553826 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " 
pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.554287 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.558560 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-scripts\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.567664 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz2cw\" (UniqueName: \"kubernetes.io/projected/cdac4fb3-a888-4781-b1e0-99630c84fe0f-kube-api-access-jz2cw\") pod \"cinder-scheduler-0\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " pod="openstack/cinder-scheduler-0" Dec 04 17:57:53 crc kubenswrapper[4948]: I1204 17:57:53.665708 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 17:57:54 crc kubenswrapper[4948]: I1204 17:57:54.138364 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 17:57:54 crc kubenswrapper[4948]: W1204 17:57:54.140879 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdac4fb3_a888_4781_b1e0_99630c84fe0f.slice/crio-14825261de269435ce1738d5aee1d307e7f78eb8bc1ae29be0c071bdd46bd699 WatchSource:0}: Error finding container 14825261de269435ce1738d5aee1d307e7f78eb8bc1ae29be0c071bdd46bd699: Status 404 returned error can't find the container with id 14825261de269435ce1738d5aee1d307e7f78eb8bc1ae29be0c071bdd46bd699 Dec 04 17:57:54 crc kubenswrapper[4948]: I1204 17:57:54.555215 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 17:57:54 crc kubenswrapper[4948]: I1204 17:57:54.924405 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4b458e9-976a-4599-8b47-9f9c368eff65" path="/var/lib/kubelet/pods/d4b458e9-976a-4599-8b47-9f9c368eff65/volumes" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.011478 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cdac4fb3-a888-4781-b1e0-99630c84fe0f","Type":"ContainerStarted","Data":"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc"} Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.011529 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cdac4fb3-a888-4781-b1e0-99630c84fe0f","Type":"ContainerStarted","Data":"14825261de269435ce1738d5aee1d307e7f78eb8bc1ae29be0c071bdd46bd699"} Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.166606 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.167985 4948 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.172394 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.174484 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rz72z" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.179388 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.207106 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.288130 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.288216 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config-secret\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.288260 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk6dj\" (UniqueName: \"kubernetes.io/projected/52745282-c8f7-4bfd-872c-4ece0c381002-kube-api-access-wk6dj\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: 
I1204 17:57:55.288350 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.389971 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.390056 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config-secret\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.390090 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk6dj\" (UniqueName: \"kubernetes.io/projected/52745282-c8f7-4bfd-872c-4ece0c381002-kube-api-access-wk6dj\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.390169 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.391668 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.398979 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-combined-ca-bundle\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.401492 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config-secret\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.409528 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk6dj\" (UniqueName: \"kubernetes.io/projected/52745282-c8f7-4bfd-872c-4ece0c381002-kube-api-access-wk6dj\") pod \"openstackclient\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.514360 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.515019 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.525672 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.553488 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.557102 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.582540 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:55 crc kubenswrapper[4948]: E1204 17:57:55.691781 4948 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 04 17:57:55 crc kubenswrapper[4948]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_52745282-c8f7-4bfd-872c-4ece0c381002_0(556fbac41321e2bacb503070043a386d1f37cd2803300f5993d1df231fe3916c): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"556fbac41321e2bacb503070043a386d1f37cd2803300f5993d1df231fe3916c" Netns:"/var/run/netns/58874a68-8ab1-4a09-8f50-2e580c701587" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=556fbac41321e2bacb503070043a386d1f37cd2803300f5993d1df231fe3916c;K8S_POD_UID=52745282-c8f7-4bfd-872c-4ece0c381002" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/52745282-c8f7-4bfd-872c-4ece0c381002]: expected pod UID "52745282-c8f7-4bfd-872c-4ece0c381002" but got "9c0787d1-2fd6-4c5c-8e07-44bcbab37320" from Kube API Dec 04 17:57:55 crc kubenswrapper[4948]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 17:57:55 crc kubenswrapper[4948]: > Dec 04 17:57:55 crc kubenswrapper[4948]: E1204 17:57:55.692117 4948 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 04 17:57:55 crc kubenswrapper[4948]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_52745282-c8f7-4bfd-872c-4ece0c381002_0(556fbac41321e2bacb503070043a386d1f37cd2803300f5993d1df231fe3916c): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"556fbac41321e2bacb503070043a386d1f37cd2803300f5993d1df231fe3916c" Netns:"/var/run/netns/58874a68-8ab1-4a09-8f50-2e580c701587" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=556fbac41321e2bacb503070043a386d1f37cd2803300f5993d1df231fe3916c;K8S_POD_UID=52745282-c8f7-4bfd-872c-4ece0c381002" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/52745282-c8f7-4bfd-872c-4ece0c381002]: expected pod UID "52745282-c8f7-4bfd-872c-4ece0c381002" but got "9c0787d1-2fd6-4c5c-8e07-44bcbab37320" from Kube API Dec 04 17:57:55 crc kubenswrapper[4948]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 04 17:57:55 crc kubenswrapper[4948]: > pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.696011 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config-secret\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.696088 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.696130 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.696211 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdrlm\" (UniqueName: \"kubernetes.io/projected/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-kube-api-access-jdrlm\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " 
pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.798543 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config-secret\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.798594 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.798641 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.798712 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdrlm\" (UniqueName: \"kubernetes.io/projected/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-kube-api-access-jdrlm\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.799592 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.804109 4948 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config-secret\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.804730 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.822452 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdrlm\" (UniqueName: \"kubernetes.io/projected/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-kube-api-access-jdrlm\") pod \"openstackclient\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " pod="openstack/openstackclient" Dec 04 17:57:55 crc kubenswrapper[4948]: I1204 17:57:55.966253 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.040074 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.040114 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cdac4fb3-a888-4781-b1e0-99630c84fe0f","Type":"ContainerStarted","Data":"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265"} Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.045717 4948 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="52745282-c8f7-4bfd-872c-4ece0c381002" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.080215 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.080197684 podStartE2EDuration="3.080197684s" podCreationTimestamp="2025-12-04 17:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:57:56.072583049 +0000 UTC m=+1887.433657461" watchObservedRunningTime="2025-12-04 17:57:56.080197684 +0000 UTC m=+1887.441272086" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.086658 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.203821 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config\") pod \"52745282-c8f7-4bfd-872c-4ece0c381002\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.204261 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-combined-ca-bundle\") pod \"52745282-c8f7-4bfd-872c-4ece0c381002\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.204334 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk6dj\" (UniqueName: \"kubernetes.io/projected/52745282-c8f7-4bfd-872c-4ece0c381002-kube-api-access-wk6dj\") pod \"52745282-c8f7-4bfd-872c-4ece0c381002\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.204401 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config-secret\") pod \"52745282-c8f7-4bfd-872c-4ece0c381002\" (UID: \"52745282-c8f7-4bfd-872c-4ece0c381002\") " Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.204551 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "52745282-c8f7-4bfd-872c-4ece0c381002" (UID: "52745282-c8f7-4bfd-872c-4ece0c381002"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.205018 4948 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.222843 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52745282-c8f7-4bfd-872c-4ece0c381002-kube-api-access-wk6dj" (OuterVolumeSpecName: "kube-api-access-wk6dj") pod "52745282-c8f7-4bfd-872c-4ece0c381002" (UID: "52745282-c8f7-4bfd-872c-4ece0c381002"). InnerVolumeSpecName "kube-api-access-wk6dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.222977 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "52745282-c8f7-4bfd-872c-4ece0c381002" (UID: "52745282-c8f7-4bfd-872c-4ece0c381002"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.225305 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52745282-c8f7-4bfd-872c-4ece0c381002" (UID: "52745282-c8f7-4bfd-872c-4ece0c381002"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.309219 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.309263 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk6dj\" (UniqueName: \"kubernetes.io/projected/52745282-c8f7-4bfd-872c-4ece0c381002-kube-api-access-wk6dj\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.309275 4948 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/52745282-c8f7-4bfd-872c-4ece0c381002-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.507605 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 17:57:56 crc kubenswrapper[4948]: W1204 17:57:56.511729 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c0787d1_2fd6_4c5c_8e07_44bcbab37320.slice/crio-f8cad91c4c57a7adc5c7c0225ef7daefe74fbcda786eea5d377d0ae4772c138e WatchSource:0}: Error finding container f8cad91c4c57a7adc5c7c0225ef7daefe74fbcda786eea5d377d0ae4772c138e: Status 404 returned error can't find the container with id f8cad91c4c57a7adc5c7c0225ef7daefe74fbcda786eea5d377d0ae4772c138e Dec 04 17:57:56 crc kubenswrapper[4948]: I1204 17:57:56.923233 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52745282-c8f7-4bfd-872c-4ece0c381002" path="/var/lib/kubelet/pods/52745282-c8f7-4bfd-872c-4ece0c381002/volumes" Dec 04 17:57:57 crc kubenswrapper[4948]: I1204 17:57:57.049007 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"9c0787d1-2fd6-4c5c-8e07-44bcbab37320","Type":"ContainerStarted","Data":"f8cad91c4c57a7adc5c7c0225ef7daefe74fbcda786eea5d377d0ae4772c138e"} Dec 04 17:57:57 crc kubenswrapper[4948]: I1204 17:57:57.049031 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 17:57:57 crc kubenswrapper[4948]: I1204 17:57:57.058087 4948 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="52745282-c8f7-4bfd-872c-4ece0c381002" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" Dec 04 17:57:57 crc kubenswrapper[4948]: I1204 17:57:57.345618 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:57 crc kubenswrapper[4948]: I1204 17:57:57.346595 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.668162 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.911657 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-79fbd4d98c-8tdt7"] Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.913335 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.932908 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.933183 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.933325 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 17:57:58 crc kubenswrapper[4948]: I1204 17:57:58.956232 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79fbd4d98c-8tdt7"] Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.058880 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-config-data\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.059023 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-log-httpd\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.059113 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-internal-tls-certs\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 
17:57:59.059145 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-combined-ca-bundle\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.059223 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-etc-swift\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.059946 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-public-tls-certs\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.060064 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpds8\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-kube-api-access-rpds8\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.060108 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-run-httpd\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 
17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161247 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-run-httpd\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161325 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-config-data\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161386 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-log-httpd\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161425 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-internal-tls-certs\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161442 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-combined-ca-bundle\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161481 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-etc-swift\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161534 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-public-tls-certs\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.161558 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpds8\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-kube-api-access-rpds8\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.168303 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-internal-tls-certs\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.168592 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-public-tls-certs\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.168799 4948 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-combined-ca-bundle\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.172206 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-log-httpd\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.172341 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-run-httpd\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.173209 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-config-data\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.177097 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-etc-swift\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.179394 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpds8\" (UniqueName: 
\"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-kube-api-access-rpds8\") pod \"swift-proxy-79fbd4d98c-8tdt7\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.256270 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.705492 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.706293 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="proxy-httpd" containerID="cri-o://f48600b73efa978b634d7b269e3abe48151d3ea30ac6e6b8f5e2b6f49bc2196f" gracePeriod=30 Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.706770 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-central-agent" containerID="cri-o://ac53562bbf54166812bd7a0b73efda3bf8f6f97111f3ec2e67b80d73e34962d8" gracePeriod=30 Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.706819 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="sg-core" containerID="cri-o://9a60739ff31490e058856809f63cd9767cc1b7d3ad856223e87883204b670f63" gracePeriod=30 Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.706850 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-notification-agent" containerID="cri-o://ca448219170cf74e4c666ed20421a892937f95be03019f5a005dac72be5e7661" gracePeriod=30 Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 
17:57:59.737165 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.737382 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-log" containerID="cri-o://dfd1336b1739747d81929cc4e0ad9e6bb803c019fe5011d59f6ebad326abb1c8" gracePeriod=30 Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.738020 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-httpd" containerID="cri-o://9408661d75bd4de856da2a30fda585946339e006d4ebeff98d7f7fa2bb71d74b" gracePeriod=30 Dec 04 17:57:59 crc kubenswrapper[4948]: I1204 17:57:59.908402 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-79fbd4d98c-8tdt7"] Dec 04 17:57:59 crc kubenswrapper[4948]: W1204 17:57:59.910900 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89ecb28d_b878_4b16_a46a_9d9be1441aca.slice/crio-0982a9d289d1595dfc873dce3bf4796b152b1f5667880fda83ced6877fcbfd94 WatchSource:0}: Error finding container 0982a9d289d1595dfc873dce3bf4796b152b1f5667880fda83ced6877fcbfd94: Status 404 returned error can't find the container with id 0982a9d289d1595dfc873dce3bf4796b152b1f5667880fda83ced6877fcbfd94 Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.089356 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" event={"ID":"89ecb28d-b878-4b16-a46a-9d9be1441aca","Type":"ContainerStarted","Data":"0982a9d289d1595dfc873dce3bf4796b152b1f5667880fda83ced6877fcbfd94"} Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.094075 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"d1cb425a-165a-4ba6-9316-3b8954b2b395","Type":"ContainerDied","Data":"dfd1336b1739747d81929cc4e0ad9e6bb803c019fe5011d59f6ebad326abb1c8"} Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.094023 4948 generic.go:334] "Generic (PLEG): container finished" podID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerID="dfd1336b1739747d81929cc4e0ad9e6bb803c019fe5011d59f6ebad326abb1c8" exitCode=143 Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.097641 4948 generic.go:334] "Generic (PLEG): container finished" podID="f1642d1b-1757-4936-973c-52bbc11672ea" containerID="f48600b73efa978b634d7b269e3abe48151d3ea30ac6e6b8f5e2b6f49bc2196f" exitCode=0 Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.097669 4948 generic.go:334] "Generic (PLEG): container finished" podID="f1642d1b-1757-4936-973c-52bbc11672ea" containerID="9a60739ff31490e058856809f63cd9767cc1b7d3ad856223e87883204b670f63" exitCode=2 Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.097685 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerDied","Data":"f48600b73efa978b634d7b269e3abe48151d3ea30ac6e6b8f5e2b6f49bc2196f"} Dec 04 17:58:00 crc kubenswrapper[4948]: I1204 17:58:00.097705 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerDied","Data":"9a60739ff31490e058856809f63cd9767cc1b7d3ad856223e87883204b670f63"} Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.112737 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" event={"ID":"89ecb28d-b878-4b16-a46a-9d9be1441aca","Type":"ContainerStarted","Data":"a43a2bd6b5a97a2d0c5ce373003143582211cc9d3bf7982dbc10ceb659d870a6"} Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.113273 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-79fbd4d98c-8tdt7" event={"ID":"89ecb28d-b878-4b16-a46a-9d9be1441aca","Type":"ContainerStarted","Data":"ab060077462d8ce9f643db68a3d7c266453bc9728c23786c15ef088ebea997bf"} Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.113291 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.113302 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.117456 4948 generic.go:334] "Generic (PLEG): container finished" podID="f1642d1b-1757-4936-973c-52bbc11672ea" containerID="ac53562bbf54166812bd7a0b73efda3bf8f6f97111f3ec2e67b80d73e34962d8" exitCode=0 Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.117494 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerDied","Data":"ac53562bbf54166812bd7a0b73efda3bf8f6f97111f3ec2e67b80d73e34962d8"} Dec 04 17:58:01 crc kubenswrapper[4948]: I1204 17:58:01.137996 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" podStartSLOduration=3.13798129 podStartE2EDuration="3.13798129s" podCreationTimestamp="2025-12-04 17:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:01.134937926 +0000 UTC m=+1892.496012328" watchObservedRunningTime="2025-12-04 17:58:01.13798129 +0000 UTC m=+1892.499055692" Dec 04 17:58:03 crc kubenswrapper[4948]: I1204 17:58:03.136317 4948 generic.go:334] "Generic (PLEG): container finished" podID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerID="9408661d75bd4de856da2a30fda585946339e006d4ebeff98d7f7fa2bb71d74b" exitCode=0 Dec 04 17:58:03 crc kubenswrapper[4948]: I1204 17:58:03.136405 4948 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1cb425a-165a-4ba6-9316-3b8954b2b395","Type":"ContainerDied","Data":"9408661d75bd4de856da2a30fda585946339e006d4ebeff98d7f7fa2bb71d74b"} Dec 04 17:58:03 crc kubenswrapper[4948]: I1204 17:58:03.631205 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.152:9292/healthcheck\": dial tcp 10.217.0.152:9292: connect: connection refused" Dec 04 17:58:03 crc kubenswrapper[4948]: I1204 17:58:03.631239 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9292/healthcheck\": dial tcp 10.217.0.152:9292: connect: connection refused" Dec 04 17:58:03 crc kubenswrapper[4948]: I1204 17:58:03.884230 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 17:58:04 crc kubenswrapper[4948]: I1204 17:58:04.149281 4948 generic.go:334] "Generic (PLEG): container finished" podID="f1642d1b-1757-4936-973c-52bbc11672ea" containerID="ca448219170cf74e4c666ed20421a892937f95be03019f5a005dac72be5e7661" exitCode=0 Dec 04 17:58:04 crc kubenswrapper[4948]: I1204 17:58:04.149329 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerDied","Data":"ca448219170cf74e4c666ed20421a892937f95be03019f5a005dac72be5e7661"} Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.622359 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-snxlb"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.623725 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.636902 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-snxlb"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.731960 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-de93-account-create-update-d8v6q"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.735995 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.738104 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.750051 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jwmbk"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.751403 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.758785 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-de93-account-create-update-d8v6q"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.767403 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jwmbk"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.784972 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltl55\" (UniqueName: \"kubernetes.io/projected/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-kube-api-access-ltl55\") pod \"nova-api-db-create-snxlb\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.785098 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-operator-scripts\") pod \"nova-api-db-create-snxlb\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.833882 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-z2hx2"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.835124 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.840506 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z2hx2"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.887541 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmg6\" (UniqueName: \"kubernetes.io/projected/e083d908-b647-4875-8ae1-d455db250897-kube-api-access-7rmg6\") pod \"nova-cell0-db-create-jwmbk\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.887689 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7744e322-879f-4483-b49e-019fc53973f5-operator-scripts\") pod \"nova-api-de93-account-create-update-d8v6q\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.887727 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dmzk\" (UniqueName: \"kubernetes.io/projected/7744e322-879f-4483-b49e-019fc53973f5-kube-api-access-9dmzk\") pod \"nova-api-de93-account-create-update-d8v6q\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.887780 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltl55\" (UniqueName: \"kubernetes.io/projected/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-kube-api-access-ltl55\") pod \"nova-api-db-create-snxlb\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.887838 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e083d908-b647-4875-8ae1-d455db250897-operator-scripts\") pod \"nova-cell0-db-create-jwmbk\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.887868 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-operator-scripts\") pod \"nova-api-db-create-snxlb\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.888898 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-operator-scripts\") pod \"nova-api-db-create-snxlb\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.916625 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltl55\" (UniqueName: \"kubernetes.io/projected/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-kube-api-access-ltl55\") pod \"nova-api-db-create-snxlb\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.940776 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a2da-account-create-update-5wtsm"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.942050 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.945891 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.957106 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a2da-account-create-update-5wtsm"] Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.980113 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.991314 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-operator-scripts\") pod \"nova-cell1-db-create-z2hx2\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.991410 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzvtj\" (UniqueName: \"kubernetes.io/projected/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-kube-api-access-qzvtj\") pod \"nova-cell1-db-create-z2hx2\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.991494 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7744e322-879f-4483-b49e-019fc53973f5-operator-scripts\") pod \"nova-api-de93-account-create-update-d8v6q\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.991534 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9dmzk\" (UniqueName: \"kubernetes.io/projected/7744e322-879f-4483-b49e-019fc53973f5-kube-api-access-9dmzk\") pod \"nova-api-de93-account-create-update-d8v6q\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.991911 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e083d908-b647-4875-8ae1-d455db250897-operator-scripts\") pod \"nova-cell0-db-create-jwmbk\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.992065 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmg6\" (UniqueName: \"kubernetes.io/projected/e083d908-b647-4875-8ae1-d455db250897-kube-api-access-7rmg6\") pod \"nova-cell0-db-create-jwmbk\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.993716 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7744e322-879f-4483-b49e-019fc53973f5-operator-scripts\") pod \"nova-api-de93-account-create-update-d8v6q\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:05 crc kubenswrapper[4948]: I1204 17:58:05.994258 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e083d908-b647-4875-8ae1-d455db250897-operator-scripts\") pod \"nova-cell0-db-create-jwmbk\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.019778 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-7rmg6\" (UniqueName: \"kubernetes.io/projected/e083d908-b647-4875-8ae1-d455db250897-kube-api-access-7rmg6\") pod \"nova-cell0-db-create-jwmbk\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.021729 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dmzk\" (UniqueName: \"kubernetes.io/projected/7744e322-879f-4483-b49e-019fc53973f5-kube-api-access-9dmzk\") pod \"nova-api-de93-account-create-update-d8v6q\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.058342 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.072718 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.093890 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md284\" (UniqueName: \"kubernetes.io/projected/4c5917bc-97e7-4fa9-b727-c503d616e67f-kube-api-access-md284\") pod \"nova-cell0-a2da-account-create-update-5wtsm\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.093954 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5917bc-97e7-4fa9-b727-c503d616e67f-operator-scripts\") pod \"nova-cell0-a2da-account-create-update-5wtsm\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: 
I1204 17:58:06.094160 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-operator-scripts\") pod \"nova-cell1-db-create-z2hx2\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.094206 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzvtj\" (UniqueName: \"kubernetes.io/projected/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-kube-api-access-qzvtj\") pod \"nova-cell1-db-create-z2hx2\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.094938 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-operator-scripts\") pod \"nova-cell1-db-create-z2hx2\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.133615 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzvtj\" (UniqueName: \"kubernetes.io/projected/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-kube-api-access-qzvtj\") pod \"nova-cell1-db-create-z2hx2\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.134233 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e276-account-create-update-99s2v"] Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.148330 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e276-account-create-update-99s2v"] Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.148447 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.151387 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.156607 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.195856 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md284\" (UniqueName: \"kubernetes.io/projected/4c5917bc-97e7-4fa9-b727-c503d616e67f-kube-api-access-md284\") pod \"nova-cell0-a2da-account-create-update-5wtsm\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.195934 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5917bc-97e7-4fa9-b727-c503d616e67f-operator-scripts\") pod \"nova-cell0-a2da-account-create-update-5wtsm\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.196910 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5917bc-97e7-4fa9-b727-c503d616e67f-operator-scripts\") pod \"nova-cell0-a2da-account-create-update-5wtsm\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.205641 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.206327 4948 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-external-api-0" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-log" containerID="cri-o://77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0" gracePeriod=30 Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.206723 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-httpd" containerID="cri-o://8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff" gracePeriod=30 Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.231703 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md284\" (UniqueName: \"kubernetes.io/projected/4c5917bc-97e7-4fa9-b727-c503d616e67f-kube-api-access-md284\") pod \"nova-cell0-a2da-account-create-update-5wtsm\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.276174 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.297755 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5dc382-7aaa-4191-8605-dd03299ca26d-operator-scripts\") pod \"nova-cell1-e276-account-create-update-99s2v\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.297911 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjt75\" (UniqueName: \"kubernetes.io/projected/ff5dc382-7aaa-4191-8605-dd03299ca26d-kube-api-access-wjt75\") pod \"nova-cell1-e276-account-create-update-99s2v\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.399325 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5dc382-7aaa-4191-8605-dd03299ca26d-operator-scripts\") pod \"nova-cell1-e276-account-create-update-99s2v\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.399439 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjt75\" (UniqueName: \"kubernetes.io/projected/ff5dc382-7aaa-4191-8605-dd03299ca26d-kube-api-access-wjt75\") pod \"nova-cell1-e276-account-create-update-99s2v\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.400338 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ff5dc382-7aaa-4191-8605-dd03299ca26d-operator-scripts\") pod \"nova-cell1-e276-account-create-update-99s2v\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.419688 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjt75\" (UniqueName: \"kubernetes.io/projected/ff5dc382-7aaa-4191-8605-dd03299ca26d-kube-api-access-wjt75\") pod \"nova-cell1-e276-account-create-update-99s2v\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:06 crc kubenswrapper[4948]: I1204 17:58:06.518771 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:07 crc kubenswrapper[4948]: I1204 17:58:07.187465 4948 generic.go:334] "Generic (PLEG): container finished" podID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerID="77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0" exitCode=143 Dec 04 17:58:07 crc kubenswrapper[4948]: I1204 17:58:07.187504 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e","Type":"ContainerDied","Data":"77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0"} Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.094580 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.097115 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.188886 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-snxlb"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.207444 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1cb425a-165a-4ba6-9316-3b8954b2b395","Type":"ContainerDied","Data":"9db2ceb9c205d0c114fd59c759e49100776d4524b356a1e0554808a9dd2d66e6"} Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.207730 4948 scope.go:117] "RemoveContainer" containerID="9408661d75bd4de856da2a30fda585946339e006d4ebeff98d7f7fa2bb71d74b" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.207851 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.220732 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1642d1b-1757-4936-973c-52bbc11672ea","Type":"ContainerDied","Data":"a0a9f479c1e16f7884e97cd89718d38bf6ce03f824db91381c662a9fb8c57a5b"} Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.220845 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.227464 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9c0787d1-2fd6-4c5c-8e07-44bcbab37320","Type":"ContainerStarted","Data":"d9a6ef11482121f4a842966dc02f23660d77b8786023f6691bf3ecec31db0c0c"} Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.248602 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.151828751 podStartE2EDuration="14.248581703s" podCreationTimestamp="2025-12-04 17:57:55 +0000 UTC" firstStartedPulling="2025-12-04 17:57:56.514531452 +0000 UTC m=+1887.875605854" lastFinishedPulling="2025-12-04 17:58:08.611284404 +0000 UTC m=+1899.972358806" observedRunningTime="2025-12-04 17:58:09.246307088 +0000 UTC m=+1900.607381490" watchObservedRunningTime="2025-12-04 17:58:09.248581703 +0000 UTC m=+1900.609656105" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.268479 4948 scope.go:117] "RemoveContainer" containerID="dfd1336b1739747d81929cc4e0ad9e6bb803c019fe5011d59f6ebad326abb1c8" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269431 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269675 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269737 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-combined-ca-bundle\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: 
\"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269769 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-httpd-run\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269798 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rz7l\" (UniqueName: \"kubernetes.io/projected/f1642d1b-1757-4936-973c-52bbc11672ea-kube-api-access-4rz7l\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269819 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-scripts\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269864 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mxfm\" (UniqueName: \"kubernetes.io/projected/d1cb425a-165a-4ba6-9316-3b8954b2b395-kube-api-access-4mxfm\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269903 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-combined-ca-bundle\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269972 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-log-httpd\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.269989 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-internal-tls-certs\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270023 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-config-data\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270046 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-run-httpd\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270115 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-sg-core-conf-yaml\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270139 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-scripts\") pod \"f1642d1b-1757-4936-973c-52bbc11672ea\" (UID: \"f1642d1b-1757-4936-973c-52bbc11672ea\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270158 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-config-data\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270204 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-logs\") pod \"d1cb425a-165a-4ba6-9316-3b8954b2b395\" (UID: \"d1cb425a-165a-4ba6-9316-3b8954b2b395\") " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270834 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.270843 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-logs" (OuterVolumeSpecName: "logs") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.273712 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.274688 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.275382 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.281292 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.281635 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1642d1b-1757-4936-973c-52bbc11672ea-kube-api-access-4rz7l" (OuterVolumeSpecName: "kube-api-access-4rz7l") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). InnerVolumeSpecName "kube-api-access-4rz7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.281672 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1cb425a-165a-4ba6-9316-3b8954b2b395-kube-api-access-4mxfm" (OuterVolumeSpecName: "kube-api-access-4mxfm") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "kube-api-access-4mxfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.283395 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-scripts" (OuterVolumeSpecName: "scripts") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.305704 4948 scope.go:117] "RemoveContainer" containerID="f48600b73efa978b634d7b269e3abe48151d3ea30ac6e6b8f5e2b6f49bc2196f" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.320218 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-scripts" (OuterVolumeSpecName: "scripts") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.339053 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.339858 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.346099 4948 scope.go:117] "RemoveContainer" containerID="9a60739ff31490e058856809f63cd9767cc1b7d3ad856223e87883204b670f63" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.348132 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.372758 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.372903 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.372980 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rz7l\" (UniqueName: \"kubernetes.io/projected/f1642d1b-1757-4936-973c-52bbc11672ea-kube-api-access-4rz7l\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.373072 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.374635 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mxfm\" (UniqueName: \"kubernetes.io/projected/d1cb425a-165a-4ba6-9316-3b8954b2b395-kube-api-access-4mxfm\") on 
node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.374727 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.374811 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.374886 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1642d1b-1757-4936-973c-52bbc11672ea-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.374959 4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.375117 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.375196 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1cb425a-165a-4ba6-9316-3b8954b2b395-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.375279 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.377480 4948 scope.go:117] "RemoveContainer" 
containerID="ca448219170cf74e4c666ed20421a892937f95be03019f5a005dac72be5e7661" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.404177 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-z2hx2"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.424660 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.438472 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-config-data" (OuterVolumeSpecName: "config-data") pod "d1cb425a-165a-4ba6-9316-3b8954b2b395" (UID: "d1cb425a-165a-4ba6-9316-3b8954b2b395"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.447627 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.475321 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a2da-account-create-update-5wtsm"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.482833 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.482866 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1cb425a-165a-4ba6-9316-3b8954b2b395-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.482876 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.485789 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-de93-account-create-update-d8v6q"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.492903 4948 scope.go:117] "RemoveContainer" containerID="ac53562bbf54166812bd7a0b73efda3bf8f6f97111f3ec2e67b80d73e34962d8" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.518902 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jwmbk"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.532531 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-config-data" (OuterVolumeSpecName: "config-data") pod "f1642d1b-1757-4936-973c-52bbc11672ea" (UID: "f1642d1b-1757-4936-973c-52bbc11672ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.584481 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1642d1b-1757-4936-973c-52bbc11672ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.674925 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e276-account-create-update-99s2v"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.692545 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.775502 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.797007 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.800159 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="proxy-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.800187 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="proxy-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.801024 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-central-agent" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.801079 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-central-agent" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.801143 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-httpd" Dec 04 
17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.801150 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.801181 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="sg-core" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.801187 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="sg-core" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.801211 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-log" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.801219 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-log" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.801252 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-notification-agent" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.801262 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-notification-agent" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.802003 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-central-agent" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.802047 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="sg-core" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.802082 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-log" Dec 04 17:58:09 crc 
kubenswrapper[4948]: I1204 17:58:09.802098 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="ceilometer-notification-agent" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.802110 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" containerName="proxy-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.802128 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" containerName="glance-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.812067 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.825135 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.826965 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.827144 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.897492 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.906705 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.918130 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.918441 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.918743 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-log" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.918777 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.918997 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.919017 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-httpd" Dec 04 17:58:09 crc kubenswrapper[4948]: E1204 17:58:09.919038 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-log" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.919045 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerName="glance-log" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.920744 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.922554 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.930937 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:09 crc kubenswrapper[4948]: I1204 17:58:09.955916 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002018 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002141 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7k4t\" (UniqueName: \"kubernetes.io/projected/c881bee3-e2f3-4da4-a12f-00db430e4323-kube-api-access-j7k4t\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002176 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002204 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002238 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-logs\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002284 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002310 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.002335 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.102906 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxrhk\" (UniqueName: 
\"kubernetes.io/projected/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-kube-api-access-lxrhk\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.103820 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-httpd-run\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.103974 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-logs\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104135 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-combined-ca-bundle\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104263 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-scripts\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104356 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-public-tls-certs\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104476 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104589 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-config-data\") pod \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\" (UID: \"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e\") " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104743 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.104971 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105065 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105171 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-logs\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105295 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-scripts\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105382 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-log-httpd\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105477 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105574 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105678 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105782 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-run-httpd\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105931 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p592h\" (UniqueName: \"kubernetes.io/projected/56024532-58d7-4eeb-b81a-332e60240238-kube-api-access-p592h\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106071 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-config-data\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106173 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106249 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc 
kubenswrapper[4948]: I1204 17:58:10.106370 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7k4t\" (UniqueName: \"kubernetes.io/projected/c881bee3-e2f3-4da4-a12f-00db430e4323-kube-api-access-j7k4t\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106442 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106550 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106620 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.105613 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.106686 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-logs\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.107683 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-logs" (OuterVolumeSpecName: "logs") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.111949 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.112268 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-kube-api-access-lxrhk" (OuterVolumeSpecName: "kube-api-access-lxrhk") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "kube-api-access-lxrhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.113440 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.113483 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.115281 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.125394 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.127618 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-scripts" (OuterVolumeSpecName: "scripts") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.133342 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7k4t\" (UniqueName: \"kubernetes.io/projected/c881bee3-e2f3-4da4-a12f-00db430e4323-kube-api-access-j7k4t\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.147759 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.153573 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.204676 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.206196 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-config-data" (OuterVolumeSpecName: "config-data") pod "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" (UID: "cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.207859 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.207937 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.207984 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-scripts\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.208008 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-log-httpd\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.208562 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-log-httpd\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.208853 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-run-httpd\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209301 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-run-httpd\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209373 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p592h\" (UniqueName: \"kubernetes.io/projected/56024532-58d7-4eeb-b81a-332e60240238-kube-api-access-p592h\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209428 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-config-data\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209515 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209528 4948 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209538 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxrhk\" (UniqueName: \"kubernetes.io/projected/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-kube-api-access-lxrhk\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209549 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209559 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209566 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.209574 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.211701 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.213485 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-config-data\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.213877 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.215715 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.220026 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-scripts\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.224144 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p592h\" (UniqueName: \"kubernetes.io/projected/56024532-58d7-4eeb-b81a-332e60240238-kube-api-access-p592h\") pod \"ceilometer-0\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.231323 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.242490 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e276-account-create-update-99s2v" event={"ID":"ff5dc382-7aaa-4191-8605-dd03299ca26d","Type":"ContainerStarted","Data":"6a4f7ca8f85f0af89c130061e5e621f92b84e9721910c5c82f119b7a393b489d"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.242542 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e276-account-create-update-99s2v" event={"ID":"ff5dc382-7aaa-4191-8605-dd03299ca26d","Type":"ContainerStarted","Data":"c98068406da6510912b355f91c379ba4d1ca7085d3358836e07981f3e13a90c0"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.255309 4948 generic.go:334] "Generic (PLEG): container finished" podID="9fcef00f-3c5c-478a-a9b4-39c07f98ff69" containerID="d9ba434bb1ea4732ea4f9fd9d3132ecc30802fef6a73cbadd99f622f46b76466" exitCode=0 Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.255394 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-snxlb" event={"ID":"9fcef00f-3c5c-478a-a9b4-39c07f98ff69","Type":"ContainerDied","Data":"d9ba434bb1ea4732ea4f9fd9d3132ecc30802fef6a73cbadd99f622f46b76466"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.255420 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-snxlb" event={"ID":"9fcef00f-3c5c-478a-a9b4-39c07f98ff69","Type":"ContainerStarted","Data":"4ffb5acb45e402b432ac7bafb3051fa30e4af5b57703cd6b37be5d75bc4dbcbc"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.256290 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-e276-account-create-update-99s2v" podStartSLOduration=4.256279041 podStartE2EDuration="4.256279041s" podCreationTimestamp="2025-12-04 17:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:10.254985239 +0000 UTC m=+1901.616059641" watchObservedRunningTime="2025-12-04 17:58:10.256279041 +0000 UTC m=+1901.617353443" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.257253 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.263542 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwmbk" event={"ID":"e083d908-b647-4875-8ae1-d455db250897","Type":"ContainerStarted","Data":"7f8cd0c6abb5d7335ee66173e4a80074451a3871e27a24379b76520ba90371c4"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.263585 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwmbk" event={"ID":"e083d908-b647-4875-8ae1-d455db250897","Type":"ContainerStarted","Data":"0157819b933a0782316884143df91f3c8937a797ce99c508176882a42d72d1f1"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.269928 4948 generic.go:334] "Generic (PLEG): container finished" podID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" containerID="8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff" exitCode=0 Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.269992 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e","Type":"ContainerDied","Data":"8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.270019 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e","Type":"ContainerDied","Data":"70ff5ceaaddaf4edf298be3e78cb83837655b116f6fe50a7a47025e0a2c02d89"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.270037 4948 scope.go:117] "RemoveContainer" 
containerID="8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.270138 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.277693 4948 generic.go:334] "Generic (PLEG): container finished" podID="4c5917bc-97e7-4fa9-b727-c503d616e67f" containerID="6d7969f01db55d8e0829906eddc67bec05bbf6201f3fa18fbe56db4e90c70181" exitCode=0 Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.277773 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" event={"ID":"4c5917bc-97e7-4fa9-b727-c503d616e67f","Type":"ContainerDied","Data":"6d7969f01db55d8e0829906eddc67bec05bbf6201f3fa18fbe56db4e90c70181"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.277804 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" event={"ID":"4c5917bc-97e7-4fa9-b727-c503d616e67f","Type":"ContainerStarted","Data":"d92566a0f97701b3ccfab85344b2b3eb8b30bca28a4a393d6fadef2b9cc02dc3"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.287172 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-jwmbk" podStartSLOduration=5.287146094 podStartE2EDuration="5.287146094s" podCreationTimestamp="2025-12-04 17:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:10.283664479 +0000 UTC m=+1901.644738881" watchObservedRunningTime="2025-12-04 17:58:10.287146094 +0000 UTC m=+1901.648220496" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.291117 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-de93-account-create-update-d8v6q" 
event={"ID":"7744e322-879f-4483-b49e-019fc53973f5","Type":"ContainerStarted","Data":"7d9b8f23ae2ef512cdc6ed34a56f990104bebe106cd1661d09bdc9adbcefb842"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.291168 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-de93-account-create-update-d8v6q" event={"ID":"7744e322-879f-4483-b49e-019fc53973f5","Type":"ContainerStarted","Data":"4861adbe0cb0ecbb714355a1907d18c9a15ef3a3956ff0c93981a4cea6de438b"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.297251 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z2hx2" event={"ID":"dfa89602-ba65-4bc5-90d0-c91e6be39d1e","Type":"ContainerStarted","Data":"cb54ba484742820a90bdd62eb825c4a750eea731a1479ae1d68670d2cb64bf30"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.297299 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z2hx2" event={"ID":"dfa89602-ba65-4bc5-90d0-c91e6be39d1e","Type":"ContainerStarted","Data":"798fac3c2b50ceb1b3801cc8f5168cd835dbc24fe07f58bd385cc53e58f5b8b9"} Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.310935 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.579553 4948 scope.go:117] "RemoveContainer" containerID="77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.591602 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.609576 4948 scope.go:117] "RemoveContainer" containerID="8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff" Dec 04 17:58:10 crc kubenswrapper[4948]: E1204 17:58:10.610229 4948 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff\": container with ID starting with 8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff not found: ID does not exist" containerID="8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.610264 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff"} err="failed to get container status \"8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff\": rpc error: code = NotFound desc = could not find container \"8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff\": container with ID starting with 8c9003bfa92d98c7f97954cc8241a83e626d97929d012a0b0886b9683aa79dff not found: ID does not exist" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.610288 4948 scope.go:117] "RemoveContainer" containerID="77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.610512 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:58:10 crc kubenswrapper[4948]: E1204 17:58:10.610588 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0\": container with ID starting with 77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0 not found: ID does not exist" containerID="77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.610610 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0"} err="failed to get 
container status \"77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0\": rpc error: code = NotFound desc = could not find container \"77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0\": container with ID starting with 77d65fda255b399eb3373696134ae8491cfb6cd6511473485c0f13152868f4f0 not found: ID does not exist" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.631936 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.633533 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.648950 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.649213 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.656655 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719620 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719677 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " 
pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719704 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5b7z\" (UniqueName: \"kubernetes.io/projected/0c08574c-af0f-4e7c-81af-b180b29ce4ee-kube-api-access-z5b7z\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719735 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-logs\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719764 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719805 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719902 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.719948 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.812907 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821671 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821717 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5b7z\" (UniqueName: \"kubernetes.io/projected/0c08574c-af0f-4e7c-81af-b180b29ce4ee-kube-api-access-z5b7z\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821740 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-logs\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821762 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821847 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821925 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.821964 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.822009 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.822578 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.824905 4948 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.825495 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-logs\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.829002 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.830161 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.842592 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " 
pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.851668 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.863277 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5b7z\" (UniqueName: \"kubernetes.io/projected/0c08574c-af0f-4e7c-81af-b180b29ce4ee-kube-api-access-z5b7z\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.870865 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.871507 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " pod="openstack/glance-default-external-api-0" Dec 04 17:58:10 crc kubenswrapper[4948]: W1204 17:58:10.889298 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc881bee3_e2f3_4da4_a12f_00db430e4323.slice/crio-7861ac1bfcdbf0b6ec2259eab46c686bd03071987d27e20ad8f3c96fb090246f WatchSource:0}: Error finding container 7861ac1bfcdbf0b6ec2259eab46c686bd03071987d27e20ad8f3c96fb090246f: Status 404 returned error can't find the container with id 7861ac1bfcdbf0b6ec2259eab46c686bd03071987d27e20ad8f3c96fb090246f Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.934405 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e" path="/var/lib/kubelet/pods/cf6e1e9b-1f4a-46e4-80e7-3acd4993b48e/volumes" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.935241 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1cb425a-165a-4ba6-9316-3b8954b2b395" path="/var/lib/kubelet/pods/d1cb425a-165a-4ba6-9316-3b8954b2b395/volumes" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.936667 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1642d1b-1757-4936-973c-52bbc11672ea" path="/var/lib/kubelet/pods/f1642d1b-1757-4936-973c-52bbc11672ea/volumes" Dec 04 17:58:10 crc kubenswrapper[4948]: I1204 17:58:10.959097 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.330009 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerStarted","Data":"ff062965c143b3e4583d2e2b7ea692008d384cb9a1d7bd6300d755abfdf4b7bf"} Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.332216 4948 generic.go:334] "Generic (PLEG): container finished" podID="ff5dc382-7aaa-4191-8605-dd03299ca26d" containerID="6a4f7ca8f85f0af89c130061e5e621f92b84e9721910c5c82f119b7a393b489d" exitCode=0 Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.332282 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e276-account-create-update-99s2v" event={"ID":"ff5dc382-7aaa-4191-8605-dd03299ca26d","Type":"ContainerDied","Data":"6a4f7ca8f85f0af89c130061e5e621f92b84e9721910c5c82f119b7a393b489d"} Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.335905 4948 generic.go:334] "Generic (PLEG): container finished" podID="dfa89602-ba65-4bc5-90d0-c91e6be39d1e" containerID="cb54ba484742820a90bdd62eb825c4a750eea731a1479ae1d68670d2cb64bf30" exitCode=0 Dec 04 17:58:11 crc kubenswrapper[4948]: 
I1204 17:58:11.335965 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z2hx2" event={"ID":"dfa89602-ba65-4bc5-90d0-c91e6be39d1e","Type":"ContainerDied","Data":"cb54ba484742820a90bdd62eb825c4a750eea731a1479ae1d68670d2cb64bf30"} Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.342424 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c881bee3-e2f3-4da4-a12f-00db430e4323","Type":"ContainerStarted","Data":"7861ac1bfcdbf0b6ec2259eab46c686bd03071987d27e20ad8f3c96fb090246f"} Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.349063 4948 generic.go:334] "Generic (PLEG): container finished" podID="e083d908-b647-4875-8ae1-d455db250897" containerID="7f8cd0c6abb5d7335ee66173e4a80074451a3871e27a24379b76520ba90371c4" exitCode=0 Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.349133 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwmbk" event={"ID":"e083d908-b647-4875-8ae1-d455db250897","Type":"ContainerDied","Data":"7f8cd0c6abb5d7335ee66173e4a80074451a3871e27a24379b76520ba90371c4"} Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.356994 4948 generic.go:334] "Generic (PLEG): container finished" podID="7744e322-879f-4483-b49e-019fc53973f5" containerID="7d9b8f23ae2ef512cdc6ed34a56f990104bebe106cd1661d09bdc9adbcefb842" exitCode=0 Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.357047 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-de93-account-create-update-d8v6q" event={"ID":"7744e322-879f-4483-b49e-019fc53973f5","Type":"ContainerDied","Data":"7d9b8f23ae2ef512cdc6ed34a56f990104bebe106cd1661d09bdc9adbcefb842"} Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.716853 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 17:58:11 crc kubenswrapper[4948]: I1204 17:58:11.980424 4948 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.053541 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5917bc-97e7-4fa9-b727-c503d616e67f-operator-scripts\") pod \"4c5917bc-97e7-4fa9-b727-c503d616e67f\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.053659 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md284\" (UniqueName: \"kubernetes.io/projected/4c5917bc-97e7-4fa9-b727-c503d616e67f-kube-api-access-md284\") pod \"4c5917bc-97e7-4fa9-b727-c503d616e67f\" (UID: \"4c5917bc-97e7-4fa9-b727-c503d616e67f\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.055823 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5917bc-97e7-4fa9-b727-c503d616e67f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c5917bc-97e7-4fa9-b727-c503d616e67f" (UID: "4c5917bc-97e7-4fa9-b727-c503d616e67f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.078455 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5917bc-97e7-4fa9-b727-c503d616e67f-kube-api-access-md284" (OuterVolumeSpecName: "kube-api-access-md284") pod "4c5917bc-97e7-4fa9-b727-c503d616e67f" (UID: "4c5917bc-97e7-4fa9-b727-c503d616e67f"). InnerVolumeSpecName "kube-api-access-md284". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.155346 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c5917bc-97e7-4fa9-b727-c503d616e67f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.155603 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md284\" (UniqueName: \"kubernetes.io/projected/4c5917bc-97e7-4fa9-b727-c503d616e67f-kube-api-access-md284\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.156313 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.189644 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.195215 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.256557 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7744e322-879f-4483-b49e-019fc53973f5-operator-scripts\") pod \"7744e322-879f-4483-b49e-019fc53973f5\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.256630 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzvtj\" (UniqueName: \"kubernetes.io/projected/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-kube-api-access-qzvtj\") pod \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.256711 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltl55\" (UniqueName: \"kubernetes.io/projected/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-kube-api-access-ltl55\") pod \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.256774 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-operator-scripts\") pod \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\" (UID: \"9fcef00f-3c5c-478a-a9b4-39c07f98ff69\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.256861 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dmzk\" (UniqueName: \"kubernetes.io/projected/7744e322-879f-4483-b49e-019fc53973f5-kube-api-access-9dmzk\") pod \"7744e322-879f-4483-b49e-019fc53973f5\" (UID: \"7744e322-879f-4483-b49e-019fc53973f5\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.256912 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-operator-scripts\") pod \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\" (UID: \"dfa89602-ba65-4bc5-90d0-c91e6be39d1e\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.258049 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9fcef00f-3c5c-478a-a9b4-39c07f98ff69" (UID: "9fcef00f-3c5c-478a-a9b4-39c07f98ff69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.258695 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7744e322-879f-4483-b49e-019fc53973f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7744e322-879f-4483-b49e-019fc53973f5" (UID: "7744e322-879f-4483-b49e-019fc53973f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.259025 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dfa89602-ba65-4bc5-90d0-c91e6be39d1e" (UID: "dfa89602-ba65-4bc5-90d0-c91e6be39d1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.261443 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7744e322-879f-4483-b49e-019fc53973f5-kube-api-access-9dmzk" (OuterVolumeSpecName: "kube-api-access-9dmzk") pod "7744e322-879f-4483-b49e-019fc53973f5" (UID: "7744e322-879f-4483-b49e-019fc53973f5"). 
InnerVolumeSpecName "kube-api-access-9dmzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.264709 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-kube-api-access-qzvtj" (OuterVolumeSpecName: "kube-api-access-qzvtj") pod "dfa89602-ba65-4bc5-90d0-c91e6be39d1e" (UID: "dfa89602-ba65-4bc5-90d0-c91e6be39d1e"). InnerVolumeSpecName "kube-api-access-qzvtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.266821 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-kube-api-access-ltl55" (OuterVolumeSpecName: "kube-api-access-ltl55") pod "9fcef00f-3c5c-478a-a9b4-39c07f98ff69" (UID: "9fcef00f-3c5c-478a-a9b4-39c07f98ff69"). InnerVolumeSpecName "kube-api-access-ltl55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.358730 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dmzk\" (UniqueName: \"kubernetes.io/projected/7744e322-879f-4483-b49e-019fc53973f5-kube-api-access-9dmzk\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.358767 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.358776 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7744e322-879f-4483-b49e-019fc53973f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.358784 4948 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qzvtj\" (UniqueName: \"kubernetes.io/projected/dfa89602-ba65-4bc5-90d0-c91e6be39d1e-kube-api-access-qzvtj\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.358793 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltl55\" (UniqueName: \"kubernetes.io/projected/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-kube-api-access-ltl55\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.358801 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fcef00f-3c5c-478a-a9b4-39c07f98ff69-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.370954 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerStarted","Data":"b8451c92fb6d5b8a27da9692927f3abdfe1fbba623ef7211823931c5f150a270"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.372270 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" event={"ID":"4c5917bc-97e7-4fa9-b727-c503d616e67f","Type":"ContainerDied","Data":"d92566a0f97701b3ccfab85344b2b3eb8b30bca28a4a393d6fadef2b9cc02dc3"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.372294 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92566a0f97701b3ccfab85344b2b3eb8b30bca28a4a393d6fadef2b9cc02dc3" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.372349 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a2da-account-create-update-5wtsm" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.380882 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-z2hx2" event={"ID":"dfa89602-ba65-4bc5-90d0-c91e6be39d1e","Type":"ContainerDied","Data":"798fac3c2b50ceb1b3801cc8f5168cd835dbc24fe07f58bd385cc53e58f5b8b9"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.380922 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="798fac3c2b50ceb1b3801cc8f5168cd835dbc24fe07f58bd385cc53e58f5b8b9" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.380980 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-z2hx2" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.396792 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c881bee3-e2f3-4da4-a12f-00db430e4323","Type":"ContainerStarted","Data":"b058b84e4f67a262a8cae930973840aaf5fda1c3dfc929a04a6794fb308c7d61"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.401619 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-snxlb" event={"ID":"9fcef00f-3c5c-478a-a9b4-39c07f98ff69","Type":"ContainerDied","Data":"4ffb5acb45e402b432ac7bafb3051fa30e4af5b57703cd6b37be5d75bc4dbcbc"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.401656 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ffb5acb45e402b432ac7bafb3051fa30e4af5b57703cd6b37be5d75bc4dbcbc" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.401714 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-snxlb" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.407651 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c08574c-af0f-4e7c-81af-b180b29ce4ee","Type":"ContainerStarted","Data":"913bf6bea6ca98aae7f0b32a0e3216f8310c0c695066ffe8b7e004fd33427427"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.413827 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-de93-account-create-update-d8v6q" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.414121 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-de93-account-create-update-d8v6q" event={"ID":"7744e322-879f-4483-b49e-019fc53973f5","Type":"ContainerDied","Data":"4861adbe0cb0ecbb714355a1907d18c9a15ef3a3956ff0c93981a4cea6de438b"} Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.414178 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4861adbe0cb0ecbb714355a1907d18c9a15ef3a3956ff0c93981a4cea6de438b" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.816611 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.856395 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.874183 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjt75\" (UniqueName: \"kubernetes.io/projected/ff5dc382-7aaa-4191-8605-dd03299ca26d-kube-api-access-wjt75\") pod \"ff5dc382-7aaa-4191-8605-dd03299ca26d\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.874478 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5dc382-7aaa-4191-8605-dd03299ca26d-operator-scripts\") pod \"ff5dc382-7aaa-4191-8605-dd03299ca26d\" (UID: \"ff5dc382-7aaa-4191-8605-dd03299ca26d\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.875438 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff5dc382-7aaa-4191-8605-dd03299ca26d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff5dc382-7aaa-4191-8605-dd03299ca26d" (UID: "ff5dc382-7aaa-4191-8605-dd03299ca26d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.880317 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5dc382-7aaa-4191-8605-dd03299ca26d-kube-api-access-wjt75" (OuterVolumeSpecName: "kube-api-access-wjt75") pod "ff5dc382-7aaa-4191-8605-dd03299ca26d" (UID: "ff5dc382-7aaa-4191-8605-dd03299ca26d"). InnerVolumeSpecName "kube-api-access-wjt75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.975787 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rmg6\" (UniqueName: \"kubernetes.io/projected/e083d908-b647-4875-8ae1-d455db250897-kube-api-access-7rmg6\") pod \"e083d908-b647-4875-8ae1-d455db250897\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.976138 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e083d908-b647-4875-8ae1-d455db250897-operator-scripts\") pod \"e083d908-b647-4875-8ae1-d455db250897\" (UID: \"e083d908-b647-4875-8ae1-d455db250897\") " Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.976731 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff5dc382-7aaa-4191-8605-dd03299ca26d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.976749 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjt75\" (UniqueName: \"kubernetes.io/projected/ff5dc382-7aaa-4191-8605-dd03299ca26d-kube-api-access-wjt75\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.977533 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e083d908-b647-4875-8ae1-d455db250897-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e083d908-b647-4875-8ae1-d455db250897" (UID: "e083d908-b647-4875-8ae1-d455db250897"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:58:12 crc kubenswrapper[4948]: I1204 17:58:12.981217 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e083d908-b647-4875-8ae1-d455db250897-kube-api-access-7rmg6" (OuterVolumeSpecName: "kube-api-access-7rmg6") pod "e083d908-b647-4875-8ae1-d455db250897" (UID: "e083d908-b647-4875-8ae1-d455db250897"). InnerVolumeSpecName "kube-api-access-7rmg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.079354 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rmg6\" (UniqueName: \"kubernetes.io/projected/e083d908-b647-4875-8ae1-d455db250897-kube-api-access-7rmg6\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.079407 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e083d908-b647-4875-8ae1-d455db250897-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.424209 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c08574c-af0f-4e7c-81af-b180b29ce4ee","Type":"ContainerStarted","Data":"db8b9187d0c187cfc911c618a1e41befbb49ce369abd91c26b5274db741964ad"} Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.424470 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c08574c-af0f-4e7c-81af-b180b29ce4ee","Type":"ContainerStarted","Data":"4ae447a7f1fce2c6cfb74358e924d30d23ffd33d65e3ced21c1749bddfe8ce91"} Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.426260 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jwmbk" 
event={"ID":"e083d908-b647-4875-8ae1-d455db250897","Type":"ContainerDied","Data":"0157819b933a0782316884143df91f3c8937a797ce99c508176882a42d72d1f1"} Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.426283 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0157819b933a0782316884143df91f3c8937a797ce99c508176882a42d72d1f1" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.426311 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jwmbk" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.428571 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerStarted","Data":"835f45e08eb23b94d3e78247bd5cee0b439e72c8831f4bccfce4c1b82b2f4bf8"} Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.430327 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e276-account-create-update-99s2v" event={"ID":"ff5dc382-7aaa-4191-8605-dd03299ca26d","Type":"ContainerDied","Data":"c98068406da6510912b355f91c379ba4d1ca7085d3358836e07981f3e13a90c0"} Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.430371 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c98068406da6510912b355f91c379ba4d1ca7085d3358836e07981f3e13a90c0" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.430351 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e276-account-create-update-99s2v" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.432727 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c881bee3-e2f3-4da4-a12f-00db430e4323","Type":"ContainerStarted","Data":"3ed5978b64fee059b95b3f3fcb1a1ab665b53aab15fb25269bdf21eeb866ef81"} Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.458104 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.458086413 podStartE2EDuration="3.458086413s" podCreationTimestamp="2025-12-04 17:58:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:13.445163937 +0000 UTC m=+1904.806238339" watchObservedRunningTime="2025-12-04 17:58:13.458086413 +0000 UTC m=+1904.819160815" Dec 04 17:58:13 crc kubenswrapper[4948]: I1204 17:58:13.469387 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.469367128 podStartE2EDuration="4.469367128s" podCreationTimestamp="2025-12-04 17:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:13.467721638 +0000 UTC m=+1904.828796050" watchObservedRunningTime="2025-12-04 17:58:13.469367128 +0000 UTC m=+1904.830441530" Dec 04 17:58:14 crc kubenswrapper[4948]: I1204 17:58:14.443527 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerStarted","Data":"4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130"} Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.187863 4948 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-xbz9m"] Dec 04 17:58:16 crc kubenswrapper[4948]: E1204 17:58:16.188938 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5917bc-97e7-4fa9-b727-c503d616e67f" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.188955 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5917bc-97e7-4fa9-b727-c503d616e67f" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: E1204 17:58:16.188971 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5dc382-7aaa-4191-8605-dd03299ca26d" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.188979 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5dc382-7aaa-4191-8605-dd03299ca26d" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: E1204 17:58:16.188989 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e083d908-b647-4875-8ae1-d455db250897" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.188997 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e083d908-b647-4875-8ae1-d455db250897" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: E1204 17:58:16.189017 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7744e322-879f-4483-b49e-019fc53973f5" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189025 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7744e322-879f-4483-b49e-019fc53973f5" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: E1204 17:58:16.189075 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa89602-ba65-4bc5-90d0-c91e6be39d1e" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: 
I1204 17:58:16.189084 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa89602-ba65-4bc5-90d0-c91e6be39d1e" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: E1204 17:58:16.189100 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcef00f-3c5c-478a-a9b4-39c07f98ff69" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189108 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcef00f-3c5c-478a-a9b4-39c07f98ff69" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189318 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa89602-ba65-4bc5-90d0-c91e6be39d1e" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189343 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5dc382-7aaa-4191-8605-dd03299ca26d" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189364 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5917bc-97e7-4fa9-b727-c503d616e67f" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189378 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="7744e322-879f-4483-b49e-019fc53973f5" containerName="mariadb-account-create-update" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189392 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcef00f-3c5c-478a-a9b4-39c07f98ff69" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.189400 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e083d908-b647-4875-8ae1-d455db250897" containerName="mariadb-database-create" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.190179 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.192563 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g2l6n" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.193159 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.194223 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.208307 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xbz9m"] Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.235786 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-config-data\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.235842 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bwfj\" (UniqueName: \"kubernetes.io/projected/c9360a48-9890-45f3-8fc3-551ba8c1521e-kube-api-access-6bwfj\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.235931 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-scripts\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " 
pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.236186 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.338003 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bwfj\" (UniqueName: \"kubernetes.io/projected/c9360a48-9890-45f3-8fc3-551ba8c1521e-kube-api-access-6bwfj\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.338097 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-scripts\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.338195 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.338294 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-config-data\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: 
\"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.342395 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-config-data\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.342477 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.342720 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-scripts\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.365583 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bwfj\" (UniqueName: \"kubernetes.io/projected/c9360a48-9890-45f3-8fc3-551ba8c1521e-kube-api-access-6bwfj\") pod \"nova-cell0-conductor-db-sync-xbz9m\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.465631 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerStarted","Data":"a5c3ab74ce72de8d6a0de418140b733edf631a1ac5a51b48081f538d6a77e68b"} Dec 04 17:58:16 crc kubenswrapper[4948]: 
I1204 17:58:16.465902 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:58:16 crc kubenswrapper[4948]: I1204 17:58:16.545299 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:17 crc kubenswrapper[4948]: I1204 17:58:17.007336 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.494578144 podStartE2EDuration="8.007309241s" podCreationTimestamp="2025-12-04 17:58:09 +0000 UTC" firstStartedPulling="2025-12-04 17:58:10.815137887 +0000 UTC m=+1902.176212289" lastFinishedPulling="2025-12-04 17:58:15.327868974 +0000 UTC m=+1906.688943386" observedRunningTime="2025-12-04 17:58:16.495414891 +0000 UTC m=+1907.856489293" watchObservedRunningTime="2025-12-04 17:58:17.007309241 +0000 UTC m=+1908.368383653" Dec 04 17:58:17 crc kubenswrapper[4948]: W1204 17:58:17.008387 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9360a48_9890_45f3_8fc3_551ba8c1521e.slice/crio-d970e59e630e5f4063c4a91299a64a90989ce3306bfac4fe91fb4128b0600e93 WatchSource:0}: Error finding container d970e59e630e5f4063c4a91299a64a90989ce3306bfac4fe91fb4128b0600e93: Status 404 returned error can't find the container with id d970e59e630e5f4063c4a91299a64a90989ce3306bfac4fe91fb4128b0600e93 Dec 04 17:58:17 crc kubenswrapper[4948]: I1204 17:58:17.008658 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xbz9m"] Dec 04 17:58:17 crc kubenswrapper[4948]: I1204 17:58:17.485379 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" event={"ID":"c9360a48-9890-45f3-8fc3-551ba8c1521e","Type":"ContainerStarted","Data":"d970e59e630e5f4063c4a91299a64a90989ce3306bfac4fe91fb4128b0600e93"} Dec 04 17:58:19 crc kubenswrapper[4948]: I1204 
17:58:19.872799 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:19 crc kubenswrapper[4948]: I1204 17:58:19.873348 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-central-agent" containerID="cri-o://b8451c92fb6d5b8a27da9692927f3abdfe1fbba623ef7211823931c5f150a270" gracePeriod=30 Dec 04 17:58:19 crc kubenswrapper[4948]: I1204 17:58:19.873393 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="proxy-httpd" containerID="cri-o://a5c3ab74ce72de8d6a0de418140b733edf631a1ac5a51b48081f538d6a77e68b" gracePeriod=30 Dec 04 17:58:19 crc kubenswrapper[4948]: I1204 17:58:19.873393 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="sg-core" containerID="cri-o://4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130" gracePeriod=30 Dec 04 17:58:19 crc kubenswrapper[4948]: I1204 17:58:19.873443 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-notification-agent" containerID="cri-o://835f45e08eb23b94d3e78247bd5cee0b439e72c8831f4bccfce4c1b82b2f4bf8" gracePeriod=30 Dec 04 17:58:20 crc kubenswrapper[4948]: E1204 17:58:20.139532 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56024532_58d7_4eeb_b81a_332e60240238.slice/crio-4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56024532_58d7_4eeb_b81a_332e60240238.slice/crio-conmon-4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56024532_58d7_4eeb_b81a_332e60240238.slice/crio-a5c3ab74ce72de8d6a0de418140b733edf631a1ac5a51b48081f538d6a77e68b.scope\": RecentStats: unable to find data in memory cache]" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.212248 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.212284 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.246788 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.282384 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.518132 4948 generic.go:334] "Generic (PLEG): container finished" podID="56024532-58d7-4eeb-b81a-332e60240238" containerID="a5c3ab74ce72de8d6a0de418140b733edf631a1ac5a51b48081f538d6a77e68b" exitCode=0 Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.518385 4948 generic.go:334] "Generic (PLEG): container finished" podID="56024532-58d7-4eeb-b81a-332e60240238" containerID="4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130" exitCode=2 Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.518397 4948 generic.go:334] "Generic (PLEG): container finished" podID="56024532-58d7-4eeb-b81a-332e60240238" containerID="835f45e08eb23b94d3e78247bd5cee0b439e72c8831f4bccfce4c1b82b2f4bf8" exitCode=0 Dec 04 
17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.518404 4948 generic.go:334] "Generic (PLEG): container finished" podID="56024532-58d7-4eeb-b81a-332e60240238" containerID="b8451c92fb6d5b8a27da9692927f3abdfe1fbba623ef7211823931c5f150a270" exitCode=0 Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.518859 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerDied","Data":"a5c3ab74ce72de8d6a0de418140b733edf631a1ac5a51b48081f538d6a77e68b"} Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.519303 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerDied","Data":"4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130"} Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.519327 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.519337 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerDied","Data":"835f45e08eb23b94d3e78247bd5cee0b439e72c8831f4bccfce4c1b82b2f4bf8"} Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.519348 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.519620 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerDied","Data":"b8451c92fb6d5b8a27da9692927f3abdfe1fbba623ef7211823931c5f150a270"} Dec 04 17:58:20 crc kubenswrapper[4948]: I1204 17:58:20.959627 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 17:58:20 crc 
kubenswrapper[4948]: I1204 17:58:20.959667 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 17:58:21 crc kubenswrapper[4948]: I1204 17:58:21.000930 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 17:58:21 crc kubenswrapper[4948]: I1204 17:58:21.020587 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 17:58:21 crc kubenswrapper[4948]: I1204 17:58:21.529186 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 17:58:21 crc kubenswrapper[4948]: I1204 17:58:21.529218 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 17:58:22 crc kubenswrapper[4948]: I1204 17:58:22.777683 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:22 crc kubenswrapper[4948]: I1204 17:58:22.778090 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:58:22 crc kubenswrapper[4948]: I1204 17:58:22.779568 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 17:58:23 crc kubenswrapper[4948]: I1204 17:58:23.587369 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 17:58:23 crc kubenswrapper[4948]: I1204 17:58:23.587506 4948 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 17:58:23 crc kubenswrapper[4948]: I1204 17:58:23.603540 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.421389 4948 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.548737 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-log-httpd\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.548833 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-combined-ca-bundle\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.548854 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-scripts\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.548898 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p592h\" (UniqueName: \"kubernetes.io/projected/56024532-58d7-4eeb-b81a-332e60240238-kube-api-access-p592h\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.548948 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-run-httpd\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.548984 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-sg-core-conf-yaml\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.549080 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-config-data\") pod \"56024532-58d7-4eeb-b81a-332e60240238\" (UID: \"56024532-58d7-4eeb-b81a-332e60240238\") " Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.549402 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.549598 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.554413 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56024532-58d7-4eeb-b81a-332e60240238-kube-api-access-p592h" (OuterVolumeSpecName: "kube-api-access-p592h") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "kube-api-access-p592h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.555898 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-scripts" (OuterVolumeSpecName: "scripts") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.567234 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.567717 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56024532-58d7-4eeb-b81a-332e60240238","Type":"ContainerDied","Data":"ff062965c143b3e4583d2e2b7ea692008d384cb9a1d7bd6300d755abfdf4b7bf"} Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.567862 4948 scope.go:117] "RemoveContainer" containerID="a5c3ab74ce72de8d6a0de418140b733edf631a1ac5a51b48081f538d6a77e68b" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.571107 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" event={"ID":"c9360a48-9890-45f3-8fc3-551ba8c1521e","Type":"ContainerStarted","Data":"9624c3c7c396b5a65af21a0e1c05976f033b2146793a0b81e5ffc20f4e076487"} Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.598702 4948 scope.go:117] "RemoveContainer" containerID="4bd1241f1da4e0a02c0a0ca8943a6ec07e111fac9423e13580158fad86d85130" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.598740 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" podStartSLOduration=1.3827072 podStartE2EDuration="9.598720574s" podCreationTimestamp="2025-12-04 17:58:16 +0000 UTC" firstStartedPulling="2025-12-04 17:58:17.011391901 +0000 UTC 
m=+1908.372466303" lastFinishedPulling="2025-12-04 17:58:25.227405275 +0000 UTC m=+1916.588479677" observedRunningTime="2025-12-04 17:58:25.593484136 +0000 UTC m=+1916.954558558" watchObservedRunningTime="2025-12-04 17:58:25.598720574 +0000 UTC m=+1916.959794966" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.601718 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.621986 4948 scope.go:117] "RemoveContainer" containerID="835f45e08eb23b94d3e78247bd5cee0b439e72c8831f4bccfce4c1b82b2f4bf8" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.641310 4948 scope.go:117] "RemoveContainer" containerID="b8451c92fb6d5b8a27da9692927f3abdfe1fbba623ef7211823931c5f150a270" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.650972 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.650997 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.651007 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p592h\" (UniqueName: \"kubernetes.io/projected/56024532-58d7-4eeb-b81a-332e60240238-kube-api-access-p592h\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.651016 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/56024532-58d7-4eeb-b81a-332e60240238-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.651025 4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.660679 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.682314 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-config-data" (OuterVolumeSpecName: "config-data") pod "56024532-58d7-4eeb-b81a-332e60240238" (UID: "56024532-58d7-4eeb-b81a-332e60240238"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.752360 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.752397 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56024532-58d7-4eeb-b81a-332e60240238-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.912775 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.922856 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.952193 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:25 crc kubenswrapper[4948]: E1204 17:58:25.953540 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-notification-agent" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953589 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-notification-agent" Dec 04 17:58:25 crc kubenswrapper[4948]: E1204 17:58:25.953609 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-central-agent" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953617 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-central-agent" Dec 04 17:58:25 crc kubenswrapper[4948]: E1204 17:58:25.953653 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="sg-core" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953662 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="sg-core" Dec 04 17:58:25 crc kubenswrapper[4948]: E1204 17:58:25.953675 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="proxy-httpd" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953682 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="proxy-httpd" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953905 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="sg-core" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953920 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-central-agent" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953931 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="ceilometer-notification-agent" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.953948 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="56024532-58d7-4eeb-b81a-332e60240238" containerName="proxy-httpd" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.957730 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.959993 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.961388 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:58:25 crc kubenswrapper[4948]: I1204 17:58:25.974175 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.057183 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.057452 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.057584 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-log-httpd\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.057788 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddrxq\" (UniqueName: \"kubernetes.io/projected/356cc75c-7afc-4735-ad87-1a80fd0317ff-kube-api-access-ddrxq\") pod \"ceilometer-0\" (UID: 
\"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.057845 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-run-httpd\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.057915 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-scripts\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.058024 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-config-data\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.159843 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddrxq\" (UniqueName: \"kubernetes.io/projected/356cc75c-7afc-4735-ad87-1a80fd0317ff-kube-api-access-ddrxq\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.159895 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-run-httpd\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.159919 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-scripts\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.159961 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-config-data\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.159991 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.160093 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.160568 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-run-httpd\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.160805 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-log-httpd\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " 
pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.161186 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-log-httpd\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.163776 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.164342 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.184111 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-config-data\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.187486 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-scripts\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.189028 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddrxq\" (UniqueName: 
\"kubernetes.io/projected/356cc75c-7afc-4735-ad87-1a80fd0317ff-kube-api-access-ddrxq\") pod \"ceilometer-0\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.274458 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.756697 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:26 crc kubenswrapper[4948]: I1204 17:58:26.936407 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56024532-58d7-4eeb-b81a-332e60240238" path="/var/lib/kubelet/pods/56024532-58d7-4eeb-b81a-332e60240238/volumes" Dec 04 17:58:27 crc kubenswrapper[4948]: I1204 17:58:27.595637 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerStarted","Data":"1fe66982e0e16abfd0746da1294a7911efa6d671fed2c7d0ca82493dcb71d88e"} Dec 04 17:58:27 crc kubenswrapper[4948]: I1204 17:58:27.595907 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerStarted","Data":"6a86a1aa9b52660ddea6d595aded9af9ad358983a00d21ad479e90728ecbd2c0"} Dec 04 17:58:28 crc kubenswrapper[4948]: I1204 17:58:28.605927 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerStarted","Data":"3dfc514c6e426cb101de9791936ede5dd2696e9f714efe839e8d0318b4422575"} Dec 04 17:58:29 crc kubenswrapper[4948]: I1204 17:58:29.618641 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerStarted","Data":"e73ef79d3af299f20add2e7b2ec2214ddcb364c22e22354f52021597d1cd69a0"} Dec 04 17:58:29 crc kubenswrapper[4948]: I1204 
17:58:29.959215 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:31 crc kubenswrapper[4948]: I1204 17:58:31.642515 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerStarted","Data":"0e4b4f08bdf858474b8cb10f21f7d3641da527593681765bd9b01a6b0300a56e"} Dec 04 17:58:33 crc kubenswrapper[4948]: I1204 17:58:33.658128 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-central-agent" containerID="cri-o://1fe66982e0e16abfd0746da1294a7911efa6d671fed2c7d0ca82493dcb71d88e" gracePeriod=30 Dec 04 17:58:33 crc kubenswrapper[4948]: I1204 17:58:33.658384 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:58:33 crc kubenswrapper[4948]: I1204 17:58:33.658602 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="proxy-httpd" containerID="cri-o://0e4b4f08bdf858474b8cb10f21f7d3641da527593681765bd9b01a6b0300a56e" gracePeriod=30 Dec 04 17:58:33 crc kubenswrapper[4948]: I1204 17:58:33.658703 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-notification-agent" containerID="cri-o://3dfc514c6e426cb101de9791936ede5dd2696e9f714efe839e8d0318b4422575" gracePeriod=30 Dec 04 17:58:33 crc kubenswrapper[4948]: I1204 17:58:33.658758 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="sg-core" containerID="cri-o://e73ef79d3af299f20add2e7b2ec2214ddcb364c22e22354f52021597d1cd69a0" gracePeriod=30 Dec 04 17:58:33 crc 
kubenswrapper[4948]: I1204 17:58:33.696961 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.7555822370000005 podStartE2EDuration="8.696939393s" podCreationTimestamp="2025-12-04 17:58:25 +0000 UTC" firstStartedPulling="2025-12-04 17:58:26.750263981 +0000 UTC m=+1918.111338383" lastFinishedPulling="2025-12-04 17:58:29.691621137 +0000 UTC m=+1921.052695539" observedRunningTime="2025-12-04 17:58:33.692407592 +0000 UTC m=+1925.053482004" watchObservedRunningTime="2025-12-04 17:58:33.696939393 +0000 UTC m=+1925.058013785" Dec 04 17:58:34 crc kubenswrapper[4948]: I1204 17:58:34.671694 4948 generic.go:334] "Generic (PLEG): container finished" podID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerID="0e4b4f08bdf858474b8cb10f21f7d3641da527593681765bd9b01a6b0300a56e" exitCode=0 Dec 04 17:58:34 crc kubenswrapper[4948]: I1204 17:58:34.672011 4948 generic.go:334] "Generic (PLEG): container finished" podID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerID="e73ef79d3af299f20add2e7b2ec2214ddcb364c22e22354f52021597d1cd69a0" exitCode=2 Dec 04 17:58:34 crc kubenswrapper[4948]: I1204 17:58:34.672032 4948 generic.go:334] "Generic (PLEG): container finished" podID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerID="3dfc514c6e426cb101de9791936ede5dd2696e9f714efe839e8d0318b4422575" exitCode=0 Dec 04 17:58:34 crc kubenswrapper[4948]: I1204 17:58:34.671796 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerDied","Data":"0e4b4f08bdf858474b8cb10f21f7d3641da527593681765bd9b01a6b0300a56e"} Dec 04 17:58:34 crc kubenswrapper[4948]: I1204 17:58:34.672183 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerDied","Data":"e73ef79d3af299f20add2e7b2ec2214ddcb364c22e22354f52021597d1cd69a0"} Dec 04 17:58:34 crc kubenswrapper[4948]: 
I1204 17:58:34.672211 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerDied","Data":"3dfc514c6e426cb101de9791936ede5dd2696e9f714efe839e8d0318b4422575"} Dec 04 17:58:38 crc kubenswrapper[4948]: I1204 17:58:38.710458 4948 generic.go:334] "Generic (PLEG): container finished" podID="c9360a48-9890-45f3-8fc3-551ba8c1521e" containerID="9624c3c7c396b5a65af21a0e1c05976f033b2146793a0b81e5ffc20f4e076487" exitCode=0 Dec 04 17:58:38 crc kubenswrapper[4948]: I1204 17:58:38.710741 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" event={"ID":"c9360a48-9890-45f3-8fc3-551ba8c1521e","Type":"ContainerDied","Data":"9624c3c7c396b5a65af21a0e1c05976f033b2146793a0b81e5ffc20f4e076487"} Dec 04 17:58:38 crc kubenswrapper[4948]: I1204 17:58:38.714966 4948 generic.go:334] "Generic (PLEG): container finished" podID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerID="1fe66982e0e16abfd0746da1294a7911efa6d671fed2c7d0ca82493dcb71d88e" exitCode=0 Dec 04 17:58:38 crc kubenswrapper[4948]: I1204 17:58:38.715004 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerDied","Data":"1fe66982e0e16abfd0746da1294a7911efa6d671fed2c7d0ca82493dcb71d88e"} Dec 04 17:58:38 crc kubenswrapper[4948]: I1204 17:58:38.921218 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016717 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-run-httpd\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016757 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-sg-core-conf-yaml\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016809 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-combined-ca-bundle\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016828 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-log-httpd\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016869 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-config-data\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016895 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddrxq\" (UniqueName: 
\"kubernetes.io/projected/356cc75c-7afc-4735-ad87-1a80fd0317ff-kube-api-access-ddrxq\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.016944 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-scripts\") pod \"356cc75c-7afc-4735-ad87-1a80fd0317ff\" (UID: \"356cc75c-7afc-4735-ad87-1a80fd0317ff\") " Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.017690 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.017795 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.018347 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.018368 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/356cc75c-7afc-4735-ad87-1a80fd0317ff-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.030868 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356cc75c-7afc-4735-ad87-1a80fd0317ff-kube-api-access-ddrxq" (OuterVolumeSpecName: "kube-api-access-ddrxq") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "kube-api-access-ddrxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.031151 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-scripts" (OuterVolumeSpecName: "scripts") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.042701 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.086619 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.114831 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-config-data" (OuterVolumeSpecName: "config-data") pod "356cc75c-7afc-4735-ad87-1a80fd0317ff" (UID: "356cc75c-7afc-4735-ad87-1a80fd0317ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.120119 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.120256 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.120336 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddrxq\" (UniqueName: \"kubernetes.io/projected/356cc75c-7afc-4735-ad87-1a80fd0317ff-kube-api-access-ddrxq\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.120420 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 
17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.120494 4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/356cc75c-7afc-4735-ad87-1a80fd0317ff-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.726866 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"356cc75c-7afc-4735-ad87-1a80fd0317ff","Type":"ContainerDied","Data":"6a86a1aa9b52660ddea6d595aded9af9ad358983a00d21ad479e90728ecbd2c0"} Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.726947 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.727244 4948 scope.go:117] "RemoveContainer" containerID="0e4b4f08bdf858474b8cb10f21f7d3641da527593681765bd9b01a6b0300a56e" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.772132 4948 scope.go:117] "RemoveContainer" containerID="e73ef79d3af299f20add2e7b2ec2214ddcb364c22e22354f52021597d1cd69a0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.776859 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.786867 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.796446 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:39 crc kubenswrapper[4948]: E1204 17:58:39.801680 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-central-agent" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801710 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-central-agent" Dec 04 17:58:39 crc kubenswrapper[4948]: E1204 
17:58:39.801728 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="sg-core" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801735 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="sg-core" Dec 04 17:58:39 crc kubenswrapper[4948]: E1204 17:58:39.801746 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="proxy-httpd" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801752 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="proxy-httpd" Dec 04 17:58:39 crc kubenswrapper[4948]: E1204 17:58:39.801765 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-notification-agent" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801771 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-notification-agent" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801927 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="proxy-httpd" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801937 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-central-agent" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.801947 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="ceilometer-notification-agent" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.802534 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" containerName="sg-core" Dec 04 17:58:39 crc 
kubenswrapper[4948]: I1204 17:58:39.804302 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.806379 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.806678 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.808485 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.830551 4948 scope.go:117] "RemoveContainer" containerID="3dfc514c6e426cb101de9791936ede5dd2696e9f714efe839e8d0318b4422575" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.873434 4948 scope.go:117] "RemoveContainer" containerID="1fe66982e0e16abfd0746da1294a7911efa6d671fed2c7d0ca82493dcb71d88e" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.939399 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-config-data\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.939453 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-run-httpd\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.939506 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-log-httpd\") pod 
\"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.939718 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.939861 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-scripts\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.940015 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjc56\" (UniqueName: \"kubernetes.io/projected/3fcef507-b266-492a-8877-f773828b5b0f-kube-api-access-fjc56\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:39 crc kubenswrapper[4948]: I1204 17:58:39.940058 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.042343 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-log-httpd\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 
17:58:40.042431 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.042504 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-scripts\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.042554 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjc56\" (UniqueName: \"kubernetes.io/projected/3fcef507-b266-492a-8877-f773828b5b0f-kube-api-access-fjc56\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.042577 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.042667 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-config-data\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.042701 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-run-httpd\") pod 
\"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.043572 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-log-httpd\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.043624 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-run-httpd\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.047176 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.047908 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-config-data\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.054932 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.062072 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjc56\" (UniqueName: 
\"kubernetes.io/projected/3fcef507-b266-492a-8877-f773828b5b0f-kube-api-access-fjc56\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.063360 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-scripts\") pod \"ceilometer-0\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.135145 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.149152 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.246528 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-config-data\") pod \"c9360a48-9890-45f3-8fc3-551ba8c1521e\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.246578 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-scripts\") pod \"c9360a48-9890-45f3-8fc3-551ba8c1521e\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.246670 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-combined-ca-bundle\") pod \"c9360a48-9890-45f3-8fc3-551ba8c1521e\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.246699 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bwfj\" (UniqueName: \"kubernetes.io/projected/c9360a48-9890-45f3-8fc3-551ba8c1521e-kube-api-access-6bwfj\") pod \"c9360a48-9890-45f3-8fc3-551ba8c1521e\" (UID: \"c9360a48-9890-45f3-8fc3-551ba8c1521e\") " Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.250558 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9360a48-9890-45f3-8fc3-551ba8c1521e-kube-api-access-6bwfj" (OuterVolumeSpecName: "kube-api-access-6bwfj") pod "c9360a48-9890-45f3-8fc3-551ba8c1521e" (UID: "c9360a48-9890-45f3-8fc3-551ba8c1521e"). InnerVolumeSpecName "kube-api-access-6bwfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.251941 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-scripts" (OuterVolumeSpecName: "scripts") pod "c9360a48-9890-45f3-8fc3-551ba8c1521e" (UID: "c9360a48-9890-45f3-8fc3-551ba8c1521e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.282554 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9360a48-9890-45f3-8fc3-551ba8c1521e" (UID: "c9360a48-9890-45f3-8fc3-551ba8c1521e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.313069 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-config-data" (OuterVolumeSpecName: "config-data") pod "c9360a48-9890-45f3-8fc3-551ba8c1521e" (UID: "c9360a48-9890-45f3-8fc3-551ba8c1521e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.349560 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.349959 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bwfj\" (UniqueName: \"kubernetes.io/projected/c9360a48-9890-45f3-8fc3-551ba8c1521e-kube-api-access-6bwfj\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.349994 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.350019 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9360a48-9890-45f3-8fc3-551ba8c1521e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.695240 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.736460 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerStarted","Data":"b3b7da2ace18f4c86a3c1719035e0e96e88afeb665af336abfa4ab80e3a51508"} Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.740809 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" event={"ID":"c9360a48-9890-45f3-8fc3-551ba8c1521e","Type":"ContainerDied","Data":"d970e59e630e5f4063c4a91299a64a90989ce3306bfac4fe91fb4128b0600e93"} Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.740857 4948 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="d970e59e630e5f4063c4a91299a64a90989ce3306bfac4fe91fb4128b0600e93" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.740925 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-xbz9m" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.808383 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 17:58:40 crc kubenswrapper[4948]: E1204 17:58:40.808852 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9360a48-9890-45f3-8fc3-551ba8c1521e" containerName="nova-cell0-conductor-db-sync" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.808877 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9360a48-9890-45f3-8fc3-551ba8c1521e" containerName="nova-cell0-conductor-db-sync" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.809113 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9360a48-9890-45f3-8fc3-551ba8c1521e" containerName="nova-cell0-conductor-db-sync" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.809882 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.812262 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.813932 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g2l6n" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.821427 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.932156 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356cc75c-7afc-4735-ad87-1a80fd0317ff" path="/var/lib/kubelet/pods/356cc75c-7afc-4735-ad87-1a80fd0317ff/volumes" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.968003 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.968221 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:40 crc kubenswrapper[4948]: I1204 17:58:40.968254 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8tf\" (UniqueName: \"kubernetes.io/projected/214441b7-69b1-4518-a135-73de11d39a1d-kube-api-access-rq8tf\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " 
pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.069400 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.069785 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.069808 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8tf\" (UniqueName: \"kubernetes.io/projected/214441b7-69b1-4518-a135-73de11d39a1d-kube-api-access-rq8tf\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.075619 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.076547 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.103592 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8tf\" (UniqueName: \"kubernetes.io/projected/214441b7-69b1-4518-a135-73de11d39a1d-kube-api-access-rq8tf\") pod \"nova-cell0-conductor-0\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") " pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.124846 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.598294 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 17:58:41 crc kubenswrapper[4948]: W1204 17:58:41.605916 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod214441b7_69b1_4518_a135_73de11d39a1d.slice/crio-217ca86a7b9acc6c236f585c2eec3413fa21473be47f3842fd32b9477abf3d12 WatchSource:0}: Error finding container 217ca86a7b9acc6c236f585c2eec3413fa21473be47f3842fd32b9477abf3d12: Status 404 returned error can't find the container with id 217ca86a7b9acc6c236f585c2eec3413fa21473be47f3842fd32b9477abf3d12 Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.753866 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerStarted","Data":"5f98cd6378510fbf07495b70d5dd6ac6dfb9412e84148157e5fea6706430acf5"} Dec 04 17:58:41 crc kubenswrapper[4948]: I1204 17:58:41.755218 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"214441b7-69b1-4518-a135-73de11d39a1d","Type":"ContainerStarted","Data":"217ca86a7b9acc6c236f585c2eec3413fa21473be47f3842fd32b9477abf3d12"} Dec 04 17:58:42 crc kubenswrapper[4948]: I1204 17:58:42.765136 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"214441b7-69b1-4518-a135-73de11d39a1d","Type":"ContainerStarted","Data":"1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f"} Dec 04 17:58:42 crc kubenswrapper[4948]: I1204 17:58:42.765437 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:42 crc kubenswrapper[4948]: I1204 17:58:42.776759 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerStarted","Data":"e12f22149b667115048940a93e4c08168a5eed7b4451bd6cc55a9ed2dac86031"} Dec 04 17:58:42 crc kubenswrapper[4948]: I1204 17:58:42.788480 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.78845984 podStartE2EDuration="2.78845984s" podCreationTimestamp="2025-12-04 17:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:42.779408209 +0000 UTC m=+1934.140482611" watchObservedRunningTime="2025-12-04 17:58:42.78845984 +0000 UTC m=+1934.149534242" Dec 04 17:58:43 crc kubenswrapper[4948]: I1204 17:58:43.792579 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerStarted","Data":"b0d8cedf17b3bdea647b1eaa2b97547299327f395dded493f35e180440142a2c"} Dec 04 17:58:44 crc kubenswrapper[4948]: I1204 17:58:44.801732 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerStarted","Data":"0b0d4c57d69658e7e6ba20c11a9023d6e456f46d1183ca77fdd5aa0bb0a78b1a"} Dec 04 17:58:44 crc kubenswrapper[4948]: I1204 17:58:44.802090 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:58:44 crc kubenswrapper[4948]: I1204 
17:58:44.830974 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.092525 podStartE2EDuration="5.830952405s" podCreationTimestamp="2025-12-04 17:58:39 +0000 UTC" firstStartedPulling="2025-12-04 17:58:40.697721887 +0000 UTC m=+1932.058796289" lastFinishedPulling="2025-12-04 17:58:44.436149272 +0000 UTC m=+1935.797223694" observedRunningTime="2025-12-04 17:58:44.828265959 +0000 UTC m=+1936.189340361" watchObservedRunningTime="2025-12-04 17:58:44.830952405 +0000 UTC m=+1936.192026817" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.162359 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.735454 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-2qm6f"] Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.736803 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.741059 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.741391 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.761722 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2qm6f"] Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.908250 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-scripts\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.908482 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-config-data\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.908595 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:51 crc kubenswrapper[4948]: I1204 17:58:51.908686 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndnv9\" (UniqueName: 
\"kubernetes.io/projected/c608c956-a885-4f52-8f3c-24e9f5283cb3-kube-api-access-ndnv9\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.010057 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-scripts\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.010393 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-config-data\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.010493 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.011023 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndnv9\" (UniqueName: \"kubernetes.io/projected/c608c956-a885-4f52-8f3c-24e9f5283cb3-kube-api-access-ndnv9\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.019186 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-config-data\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.019521 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-scripts\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.027226 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.048453 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndnv9\" (UniqueName: \"kubernetes.io/projected/c608c956-a885-4f52-8f3c-24e9f5283cb3-kube-api-access-ndnv9\") pod \"nova-cell0-cell-mapping-2qm6f\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.064096 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.110123 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.111605 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.114998 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.115411 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.136620 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.136740 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.140289 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.163992 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.239855 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.240103 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd37406-58df-4458-a09b-61f53fc18edf-logs\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.240190 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6h8\" (UniqueName: 
\"kubernetes.io/projected/ddd37406-58df-4458-a09b-61f53fc18edf-kube-api-access-9f6h8\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.240264 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8ws\" (UniqueName: \"kubernetes.io/projected/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-kube-api-access-vq8ws\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.240376 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.240399 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-config-data\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.240431 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-config-data\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.271862 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.289499 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.296643 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347465 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8ws\" (UniqueName: \"kubernetes.io/projected/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-kube-api-access-vq8ws\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347555 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-config-data\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347598 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347629 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347658 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-config-data\") pod \"nova-metadata-0\" (UID: 
\"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347689 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-config-data\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347732 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347757 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-logs\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347798 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrxw\" (UniqueName: \"kubernetes.io/projected/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-kube-api-access-vnrxw\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347882 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd37406-58df-4458-a09b-61f53fc18edf-logs\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.347946 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9f6h8\" (UniqueName: \"kubernetes.io/projected/ddd37406-58df-4458-a09b-61f53fc18edf-kube-api-access-9f6h8\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.358752 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.366553 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-config-data\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.374220 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.376538 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-config-data\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.376900 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd37406-58df-4458-a09b-61f53fc18edf-logs\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.376944 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:58:52 crc 
kubenswrapper[4948]: I1204 17:58:52.378224 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8ws\" (UniqueName: \"kubernetes.io/projected/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-kube-api-access-vq8ws\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.385742 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.389169 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.394630 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6h8\" (UniqueName: \"kubernetes.io/projected/ddd37406-58df-4458-a09b-61f53fc18edf-kube-api-access-9f6h8\") pod \"nova-metadata-0\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.403308 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.419891 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.424028 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btrns"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.425699 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.433277 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btrns"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456781 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456827 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-svc\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456845 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456878 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k64s\" (UniqueName: \"kubernetes.io/projected/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-kube-api-access-5k64s\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456912 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-logs\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456935 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456955 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrxw\" (UniqueName: \"kubernetes.io/projected/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-kube-api-access-vnrxw\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456969 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.456999 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.457036 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-config\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.457073 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.457108 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmz5\" (UniqueName: \"kubernetes.io/projected/36484b96-9927-4fae-b2b1-95c5bf766b21-kube-api-access-4kmz5\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.457146 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-config-data\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.458082 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-logs\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.460658 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.460758 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-config-data\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.474352 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrxw\" (UniqueName: \"kubernetes.io/projected/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-kube-api-access-vnrxw\") pod \"nova-api-0\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560661 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-config\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560713 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560758 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kmz5\" (UniqueName: \"kubernetes.io/projected/36484b96-9927-4fae-b2b1-95c5bf766b21-kube-api-access-4kmz5\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560815 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-svc\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560831 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560860 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k64s\" (UniqueName: \"kubernetes.io/projected/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-kube-api-access-5k64s\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560894 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560915 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.560942 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.561812 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.562345 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-config\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.563517 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.564504 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.564573 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-svc\") 
pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.566288 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.567478 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.584443 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k64s\" (UniqueName: \"kubernetes.io/projected/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-kube-api-access-5k64s\") pod \"dnsmasq-dns-bccf8f775-btrns\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.585098 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.586127 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kmz5\" (UniqueName: \"kubernetes.io/projected/36484b96-9927-4fae-b2b1-95c5bf766b21-kube-api-access-4kmz5\") pod \"nova-cell1-novncproxy-0\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.599813 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.620462 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.728499 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2qm6f"] Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.728527 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.742205 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:52 crc kubenswrapper[4948]: I1204 17:58:52.931694 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2qm6f" event={"ID":"c608c956-a885-4f52-8f3c-24e9f5283cb3","Type":"ContainerStarted","Data":"c50d03c708fae06a7b8e841d2733e8c82536bfb7d7fc345d73fdcfee9242ab6d"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.035751 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p2zk5"] Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.037227 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.043661 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.043878 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.052706 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p2zk5"] Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.106219 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.116198 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:53 crc kubenswrapper[4948]: W1204 17:58:53.139549 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd37406_58df_4458_a09b_61f53fc18edf.slice/crio-1611ac99dd1209312bfdc6c7a16103283b7eb00c5eed8bedca1be6d78f6d2894 WatchSource:0}: Error finding container 1611ac99dd1209312bfdc6c7a16103283b7eb00c5eed8bedca1be6d78f6d2894: Status 404 returned error can't find the container with id 1611ac99dd1209312bfdc6c7a16103283b7eb00c5eed8bedca1be6d78f6d2894 Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.172379 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.172985 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnz8\" (UniqueName: \"kubernetes.io/projected/ed713933-db04-4fdc-805d-7306d1cf2ec3-kube-api-access-rvnz8\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.173010 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-config-data\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.173304 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-scripts\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.262151 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.272953 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.274950 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-scripts\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.275000 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.275188 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnz8\" (UniqueName: \"kubernetes.io/projected/ed713933-db04-4fdc-805d-7306d1cf2ec3-kube-api-access-rvnz8\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.275221 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-config-data\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.280224 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.280730 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-config-data\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.288954 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-scripts\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.291626 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnz8\" (UniqueName: \"kubernetes.io/projected/ed713933-db04-4fdc-805d-7306d1cf2ec3-kube-api-access-rvnz8\") pod \"nova-cell1-conductor-db-sync-p2zk5\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.415074 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.461629 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btrns"] Dec 04 17:58:53 crc kubenswrapper[4948]: W1204 17:58:53.465305 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69bffb9a_7476_4f8b_a3ab_7e1bce0cba55.slice/crio-31483a4ebc909425e269f84686e6c88a28483fccbee64b2bb77dbf9f9d69bd95 WatchSource:0}: Error finding container 31483a4ebc909425e269f84686e6c88a28483fccbee64b2bb77dbf9f9d69bd95: Status 404 returned error can't find the container with id 31483a4ebc909425e269f84686e6c88a28483fccbee64b2bb77dbf9f9d69bd95 Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.872745 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p2zk5"] Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.935315 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c","Type":"ContainerStarted","Data":"05f065fd9e1b4bd73b40e0d1472bc3b6cf5ed6c65275ed0a21406da09ed7dd4e"} Dec 
04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.938350 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"36484b96-9927-4fae-b2b1-95c5bf766b21","Type":"ContainerStarted","Data":"cb67c6fac30128127e161596ee8dcfcd8fe5a9dd5beb1f2c2cf028b0e6bfb4dd"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.940304 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa","Type":"ContainerStarted","Data":"24a56fc1b7247c29730590d1abcfe4c5db4b785831d0a5017b7dd53c2cc7cb28"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.941794 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ddd37406-58df-4458-a09b-61f53fc18edf","Type":"ContainerStarted","Data":"1611ac99dd1209312bfdc6c7a16103283b7eb00c5eed8bedca1be6d78f6d2894"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.946770 4948 generic.go:334] "Generic (PLEG): container finished" podID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerID="0717fae57b36833de846ba2b339e69c1a155433d662f30df7772e1787eafb4f1" exitCode=0 Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.946822 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btrns" event={"ID":"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55","Type":"ContainerDied","Data":"0717fae57b36833de846ba2b339e69c1a155433d662f30df7772e1787eafb4f1"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.946837 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btrns" event={"ID":"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55","Type":"ContainerStarted","Data":"31483a4ebc909425e269f84686e6c88a28483fccbee64b2bb77dbf9f9d69bd95"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.949866 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2qm6f" 
event={"ID":"c608c956-a885-4f52-8f3c-24e9f5283cb3","Type":"ContainerStarted","Data":"89b87f88c0e902dc36dbe577d822060dd19a3f3fc060e52e6927682aa0513a8c"} Dec 04 17:58:53 crc kubenswrapper[4948]: I1204 17:58:53.953758 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" event={"ID":"ed713933-db04-4fdc-805d-7306d1cf2ec3","Type":"ContainerStarted","Data":"36e4d51ae69dafd368a7f4bc1bf3568b40e17ef46f606dd406ea91c9e271a8a1"} Dec 04 17:58:54 crc kubenswrapper[4948]: I1204 17:58:54.004278 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-2qm6f" podStartSLOduration=3.004254256 podStartE2EDuration="3.004254256s" podCreationTimestamp="2025-12-04 17:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:54.001253084 +0000 UTC m=+1945.362327506" watchObservedRunningTime="2025-12-04 17:58:54.004254256 +0000 UTC m=+1945.365328658" Dec 04 17:58:54 crc kubenswrapper[4948]: I1204 17:58:54.975890 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btrns" event={"ID":"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55","Type":"ContainerStarted","Data":"5f19696cdf8f2b7ca40c61e974811c8958a4c4252d9798e3c962a7dc83f23ba1"} Dec 04 17:58:54 crc kubenswrapper[4948]: I1204 17:58:54.976492 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:58:54 crc kubenswrapper[4948]: I1204 17:58:54.978676 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" event={"ID":"ed713933-db04-4fdc-805d-7306d1cf2ec3","Type":"ContainerStarted","Data":"23fbbace28d9a40084a9fe5535d672e5b1491a24cd558b5f91e5c18a8993a49e"} Dec 04 17:58:55 crc kubenswrapper[4948]: I1204 17:58:55.000228 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-bccf8f775-btrns" podStartSLOduration=3.000213507 podStartE2EDuration="3.000213507s" podCreationTimestamp="2025-12-04 17:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:54.99336919 +0000 UTC m=+1946.354443592" watchObservedRunningTime="2025-12-04 17:58:55.000213507 +0000 UTC m=+1946.361287909" Dec 04 17:58:55 crc kubenswrapper[4948]: I1204 17:58:55.023815 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" podStartSLOduration=2.023798353 podStartE2EDuration="2.023798353s" podCreationTimestamp="2025-12-04 17:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:58:55.014085326 +0000 UTC m=+1946.375159728" watchObservedRunningTime="2025-12-04 17:58:55.023798353 +0000 UTC m=+1946.384872755" Dec 04 17:58:55 crc kubenswrapper[4948]: I1204 17:58:55.871639 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:58:55 crc kubenswrapper[4948]: I1204 17:58:55.929370 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:56 crc kubenswrapper[4948]: I1204 17:58:56.998242 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"36484b96-9927-4fae-b2b1-95c5bf766b21","Type":"ContainerStarted","Data":"9ae407ba7abe25be3566329a0b4d24921bd780cb2e8daa33169ca790f719ed14"} Dec 04 17:58:56 crc kubenswrapper[4948]: I1204 17:58:56.998530 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="36484b96-9927-4fae-b2b1-95c5bf766b21" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9ae407ba7abe25be3566329a0b4d24921bd780cb2e8daa33169ca790f719ed14" 
gracePeriod=30 Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.001076 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa","Type":"ContainerStarted","Data":"4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd"} Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.004026 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ddd37406-58df-4458-a09b-61f53fc18edf","Type":"ContainerStarted","Data":"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892"} Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.005521 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c","Type":"ContainerStarted","Data":"4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2"} Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.027795 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.807407264 podStartE2EDuration="5.027773569s" podCreationTimestamp="2025-12-04 17:58:52 +0000 UTC" firstStartedPulling="2025-12-04 17:58:53.266629259 +0000 UTC m=+1944.627703661" lastFinishedPulling="2025-12-04 17:58:56.486995564 +0000 UTC m=+1947.848069966" observedRunningTime="2025-12-04 17:58:57.015226882 +0000 UTC m=+1948.376301294" watchObservedRunningTime="2025-12-04 17:58:57.027773569 +0000 UTC m=+1948.388847981" Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.037361 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.686326929 podStartE2EDuration="5.037340022s" podCreationTimestamp="2025-12-04 17:58:52 +0000 UTC" firstStartedPulling="2025-12-04 17:58:53.126342816 +0000 UTC m=+1944.487417218" lastFinishedPulling="2025-12-04 17:58:56.477355899 +0000 UTC m=+1947.838430311" 
observedRunningTime="2025-12-04 17:58:57.034176725 +0000 UTC m=+1948.395251127" watchObservedRunningTime="2025-12-04 17:58:57.037340022 +0000 UTC m=+1948.398414424" Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.601316 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 17:58:57 crc kubenswrapper[4948]: I1204 17:58:57.729135 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.020582 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ddd37406-58df-4458-a09b-61f53fc18edf","Type":"ContainerStarted","Data":"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8"} Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.020767 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-log" containerID="cri-o://6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892" gracePeriod=30 Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.020888 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-metadata" containerID="cri-o://83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8" gracePeriod=30 Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.023120 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c","Type":"ContainerStarted","Data":"a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a"} Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.065518 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.71753273 podStartE2EDuration="6.065499678s" podCreationTimestamp="2025-12-04 17:58:52 +0000 UTC" firstStartedPulling="2025-12-04 17:58:53.143667279 +0000 UTC m=+1944.504741681" lastFinishedPulling="2025-12-04 17:58:56.491634227 +0000 UTC m=+1947.852708629" observedRunningTime="2025-12-04 17:58:58.053653909 +0000 UTC m=+1949.414728311" watchObservedRunningTime="2025-12-04 17:58:58.065499678 +0000 UTC m=+1949.426574070" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.090198 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.879712017 podStartE2EDuration="6.09017153s" podCreationTimestamp="2025-12-04 17:58:52 +0000 UTC" firstStartedPulling="2025-12-04 17:58:53.266900176 +0000 UTC m=+1944.627974578" lastFinishedPulling="2025-12-04 17:58:56.477359689 +0000 UTC m=+1947.838434091" observedRunningTime="2025-12-04 17:58:58.076529567 +0000 UTC m=+1949.437604009" watchObservedRunningTime="2025-12-04 17:58:58.09017153 +0000 UTC m=+1949.451245952" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.730954 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.797624 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f6h8\" (UniqueName: \"kubernetes.io/projected/ddd37406-58df-4458-a09b-61f53fc18edf-kube-api-access-9f6h8\") pod \"ddd37406-58df-4458-a09b-61f53fc18edf\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.797676 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-combined-ca-bundle\") pod \"ddd37406-58df-4458-a09b-61f53fc18edf\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.797848 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd37406-58df-4458-a09b-61f53fc18edf-logs\") pod \"ddd37406-58df-4458-a09b-61f53fc18edf\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.797906 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-config-data\") pod \"ddd37406-58df-4458-a09b-61f53fc18edf\" (UID: \"ddd37406-58df-4458-a09b-61f53fc18edf\") " Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.798900 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd37406-58df-4458-a09b-61f53fc18edf-logs" (OuterVolumeSpecName: "logs") pod "ddd37406-58df-4458-a09b-61f53fc18edf" (UID: "ddd37406-58df-4458-a09b-61f53fc18edf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.804300 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd37406-58df-4458-a09b-61f53fc18edf-kube-api-access-9f6h8" (OuterVolumeSpecName: "kube-api-access-9f6h8") pod "ddd37406-58df-4458-a09b-61f53fc18edf" (UID: "ddd37406-58df-4458-a09b-61f53fc18edf"). InnerVolumeSpecName "kube-api-access-9f6h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.854257 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-config-data" (OuterVolumeSpecName: "config-data") pod "ddd37406-58df-4458-a09b-61f53fc18edf" (UID: "ddd37406-58df-4458-a09b-61f53fc18edf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.873023 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd37406-58df-4458-a09b-61f53fc18edf" (UID: "ddd37406-58df-4458-a09b-61f53fc18edf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.901826 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd37406-58df-4458-a09b-61f53fc18edf-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.902205 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.902396 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f6h8\" (UniqueName: \"kubernetes.io/projected/ddd37406-58df-4458-a09b-61f53fc18edf-kube-api-access-9f6h8\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:58 crc kubenswrapper[4948]: I1204 17:58:58.902473 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd37406-58df-4458-a09b-61f53fc18edf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.034119 4948 generic.go:334] "Generic (PLEG): container finished" podID="ddd37406-58df-4458-a09b-61f53fc18edf" containerID="83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8" exitCode=0 Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.035289 4948 generic.go:334] "Generic (PLEG): container finished" podID="ddd37406-58df-4458-a09b-61f53fc18edf" containerID="6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892" exitCode=143 Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.036154 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.036656 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ddd37406-58df-4458-a09b-61f53fc18edf","Type":"ContainerDied","Data":"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8"} Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.036752 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ddd37406-58df-4458-a09b-61f53fc18edf","Type":"ContainerDied","Data":"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892"} Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.036814 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ddd37406-58df-4458-a09b-61f53fc18edf","Type":"ContainerDied","Data":"1611ac99dd1209312bfdc6c7a16103283b7eb00c5eed8bedca1be6d78f6d2894"} Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.036888 4948 scope.go:117] "RemoveContainer" containerID="83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.063292 4948 scope.go:117] "RemoveContainer" containerID="6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.067493 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.089711 4948 scope.go:117] "RemoveContainer" containerID="83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8" Dec 04 17:58:59 crc kubenswrapper[4948]: E1204 17:58:59.090224 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8\": container with ID starting with 83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8 
not found: ID does not exist" containerID="83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.090266 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8"} err="failed to get container status \"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8\": rpc error: code = NotFound desc = could not find container \"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8\": container with ID starting with 83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8 not found: ID does not exist" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.090314 4948 scope.go:117] "RemoveContainer" containerID="6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892" Dec 04 17:58:59 crc kubenswrapper[4948]: E1204 17:58:59.090727 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892\": container with ID starting with 6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892 not found: ID does not exist" containerID="6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.090778 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892"} err="failed to get container status \"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892\": rpc error: code = NotFound desc = could not find container \"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892\": container with ID starting with 6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892 not found: ID does not exist" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 
17:58:59.090810 4948 scope.go:117] "RemoveContainer" containerID="83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.091931 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8"} err="failed to get container status \"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8\": rpc error: code = NotFound desc = could not find container \"83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8\": container with ID starting with 83d106d71b7717debb375674def387864bbb4177f09abafa855fd2993f319ba8 not found: ID does not exist" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.091966 4948 scope.go:117] "RemoveContainer" containerID="6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.092264 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892"} err="failed to get container status \"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892\": rpc error: code = NotFound desc = could not find container \"6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892\": container with ID starting with 6960b096918fa173e529f0f09a0af3138f24fb67803c2030de5d107c0fe21892 not found: ID does not exist" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.096678 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.108102 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:59 crc kubenswrapper[4948]: E1204 17:58:59.108659 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" 
containerName="nova-metadata-log" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.108675 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-log" Dec 04 17:58:59 crc kubenswrapper[4948]: E1204 17:58:59.108687 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-metadata" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.108696 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-metadata" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.108982 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-log" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.109001 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" containerName="nova-metadata-metadata" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.110283 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.113626 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.113909 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.132829 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.211486 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96898b41-f590-489d-a5c4-e929344f2a14-logs\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.211552 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.211645 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-config-data\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.211684 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.211757 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtmd\" (UniqueName: \"kubernetes.io/projected/96898b41-f590-489d-a5c4-e929344f2a14-kube-api-access-bhtmd\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.312904 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96898b41-f590-489d-a5c4-e929344f2a14-logs\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.312999 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.313113 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-config-data\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.313172 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.313232 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtmd\" (UniqueName: \"kubernetes.io/projected/96898b41-f590-489d-a5c4-e929344f2a14-kube-api-access-bhtmd\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.313419 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96898b41-f590-489d-a5c4-e929344f2a14-logs\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.318073 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.319507 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.322957 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-config-data\") pod \"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.328849 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtmd\" (UniqueName: \"kubernetes.io/projected/96898b41-f590-489d-a5c4-e929344f2a14-kube-api-access-bhtmd\") pod 
\"nova-metadata-0\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.427655 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:58:59 crc kubenswrapper[4948]: I1204 17:58:59.891379 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:58:59 crc kubenswrapper[4948]: W1204 17:58:59.895743 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96898b41_f590_489d_a5c4_e929344f2a14.slice/crio-0626814c86fd795c8662afd214b5c471ca649aa3b4bcdcd8e52b281d4d754397 WatchSource:0}: Error finding container 0626814c86fd795c8662afd214b5c471ca649aa3b4bcdcd8e52b281d4d754397: Status 404 returned error can't find the container with id 0626814c86fd795c8662afd214b5c471ca649aa3b4bcdcd8e52b281d4d754397 Dec 04 17:59:00 crc kubenswrapper[4948]: I1204 17:59:00.047811 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96898b41-f590-489d-a5c4-e929344f2a14","Type":"ContainerStarted","Data":"0626814c86fd795c8662afd214b5c471ca649aa3b4bcdcd8e52b281d4d754397"} Dec 04 17:59:00 crc kubenswrapper[4948]: I1204 17:59:00.929285 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd37406-58df-4458-a09b-61f53fc18edf" path="/var/lib/kubelet/pods/ddd37406-58df-4458-a09b-61f53fc18edf/volumes" Dec 04 17:59:01 crc kubenswrapper[4948]: I1204 17:59:01.061844 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96898b41-f590-489d-a5c4-e929344f2a14","Type":"ContainerStarted","Data":"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7"} Dec 04 17:59:01 crc kubenswrapper[4948]: I1204 17:59:01.061890 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"96898b41-f590-489d-a5c4-e929344f2a14","Type":"ContainerStarted","Data":"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639"} Dec 04 17:59:01 crc kubenswrapper[4948]: I1204 17:59:01.064486 4948 generic.go:334] "Generic (PLEG): container finished" podID="c608c956-a885-4f52-8f3c-24e9f5283cb3" containerID="89b87f88c0e902dc36dbe577d822060dd19a3f3fc060e52e6927682aa0513a8c" exitCode=0 Dec 04 17:59:01 crc kubenswrapper[4948]: I1204 17:59:01.064523 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2qm6f" event={"ID":"c608c956-a885-4f52-8f3c-24e9f5283cb3","Type":"ContainerDied","Data":"89b87f88c0e902dc36dbe577d822060dd19a3f3fc060e52e6927682aa0513a8c"} Dec 04 17:59:01 crc kubenswrapper[4948]: I1204 17:59:01.084833 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.084812388 podStartE2EDuration="2.084812388s" podCreationTimestamp="2025-12-04 17:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:01.079321964 +0000 UTC m=+1952.440396376" watchObservedRunningTime="2025-12-04 17:59:01.084812388 +0000 UTC m=+1952.445886790" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.444317 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.585472 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-scripts\") pod \"c608c956-a885-4f52-8f3c-24e9f5283cb3\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.585636 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-config-data\") pod \"c608c956-a885-4f52-8f3c-24e9f5283cb3\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.585682 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-combined-ca-bundle\") pod \"c608c956-a885-4f52-8f3c-24e9f5283cb3\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.585742 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndnv9\" (UniqueName: \"kubernetes.io/projected/c608c956-a885-4f52-8f3c-24e9f5283cb3-kube-api-access-ndnv9\") pod \"c608c956-a885-4f52-8f3c-24e9f5283cb3\" (UID: \"c608c956-a885-4f52-8f3c-24e9f5283cb3\") " Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.592621 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-scripts" (OuterVolumeSpecName: "scripts") pod "c608c956-a885-4f52-8f3c-24e9f5283cb3" (UID: "c608c956-a885-4f52-8f3c-24e9f5283cb3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.594464 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c608c956-a885-4f52-8f3c-24e9f5283cb3-kube-api-access-ndnv9" (OuterVolumeSpecName: "kube-api-access-ndnv9") pod "c608c956-a885-4f52-8f3c-24e9f5283cb3" (UID: "c608c956-a885-4f52-8f3c-24e9f5283cb3"). InnerVolumeSpecName "kube-api-access-ndnv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.600611 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.619877 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c608c956-a885-4f52-8f3c-24e9f5283cb3" (UID: "c608c956-a885-4f52-8f3c-24e9f5283cb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.620731 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.620794 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.637063 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.639628 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-config-data" (OuterVolumeSpecName: "config-data") pod "c608c956-a885-4f52-8f3c-24e9f5283cb3" (UID: "c608c956-a885-4f52-8f3c-24e9f5283cb3"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.688348 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.688396 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.688406 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndnv9\" (UniqueName: \"kubernetes.io/projected/c608c956-a885-4f52-8f3c-24e9f5283cb3-kube-api-access-ndnv9\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.688414 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c608c956-a885-4f52-8f3c-24e9f5283cb3-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.744268 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.809926 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-r8vh6"] Dec 04 17:59:02 crc kubenswrapper[4948]: I1204 17:59:02.810209 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerName="dnsmasq-dns" containerID="cri-o://f2f5428c8fb22126200466715bb9c4edf354bd2a71a3ce55d28b33cc742b21d9" gracePeriod=10 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.082092 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-2qm6f" event={"ID":"c608c956-a885-4f52-8f3c-24e9f5283cb3","Type":"ContainerDied","Data":"c50d03c708fae06a7b8e841d2733e8c82536bfb7d7fc345d73fdcfee9242ab6d"} Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.082130 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c50d03c708fae06a7b8e841d2733e8c82536bfb7d7fc345d73fdcfee9242ab6d" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.082215 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2qm6f" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.084201 4948 generic.go:334] "Generic (PLEG): container finished" podID="ed713933-db04-4fdc-805d-7306d1cf2ec3" containerID="23fbbace28d9a40084a9fe5535d672e5b1491a24cd558b5f91e5c18a8993a49e" exitCode=0 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.084347 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" event={"ID":"ed713933-db04-4fdc-805d-7306d1cf2ec3","Type":"ContainerDied","Data":"23fbbace28d9a40084a9fe5535d672e5b1491a24cd558b5f91e5c18a8993a49e"} Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.087510 4948 generic.go:334] "Generic (PLEG): container finished" podID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerID="f2f5428c8fb22126200466715bb9c4edf354bd2a71a3ce55d28b33cc742b21d9" exitCode=0 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.087585 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" event={"ID":"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e","Type":"ContainerDied","Data":"f2f5428c8fb22126200466715bb9c4edf354bd2a71a3ce55d28b33cc742b21d9"} Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.136010 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.271182 4948 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.271420 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-log" containerID="cri-o://4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2" gracePeriod=30 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.271523 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-api" containerID="cri-o://a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a" gracePeriod=30 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.278329 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": EOF" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.283225 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": EOF" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.290332 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.290555 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-log" containerID="cri-o://b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639" gracePeriod=30 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.290900 4948 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-metadata" containerID="cri-o://6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7" gracePeriod=30 Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.671425 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.837532 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.919439 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-svc\") pod \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.919561 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-sb\") pod \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.919592 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h4dl\" (UniqueName: \"kubernetes.io/projected/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-kube-api-access-7h4dl\") pod \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.919660 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-config\") pod \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " Dec 04 17:59:03 crc 
kubenswrapper[4948]: I1204 17:59:03.919679 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-nb\") pod \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.919723 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-swift-storage-0\") pod \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\" (UID: \"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e\") " Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.926480 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-kube-api-access-7h4dl" (OuterVolumeSpecName: "kube-api-access-7h4dl") pod "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" (UID: "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e"). InnerVolumeSpecName "kube-api-access-7h4dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.968264 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:03 crc kubenswrapper[4948]: I1204 17:59:03.981864 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" (UID: "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.002404 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" (UID: "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.002414 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" (UID: "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.005296 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-config" (OuterVolumeSpecName: "config") pod "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" (UID: "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.010004 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" (UID: "b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.021740 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-config-data\") pod \"96898b41-f590-489d-a5c4-e929344f2a14\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.021920 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96898b41-f590-489d-a5c4-e929344f2a14-logs\") pod \"96898b41-f590-489d-a5c4-e929344f2a14\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.022072 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-combined-ca-bundle\") pod \"96898b41-f590-489d-a5c4-e929344f2a14\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.022182 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96898b41-f590-489d-a5c4-e929344f2a14-logs" (OuterVolumeSpecName: "logs") pod "96898b41-f590-489d-a5c4-e929344f2a14" (UID: "96898b41-f590-489d-a5c4-e929344f2a14"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.022299 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhtmd\" (UniqueName: \"kubernetes.io/projected/96898b41-f590-489d-a5c4-e929344f2a14-kube-api-access-bhtmd\") pod \"96898b41-f590-489d-a5c4-e929344f2a14\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.022465 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-nova-metadata-tls-certs\") pod \"96898b41-f590-489d-a5c4-e929344f2a14\" (UID: \"96898b41-f590-489d-a5c4-e929344f2a14\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023077 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023163 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023227 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h4dl\" (UniqueName: \"kubernetes.io/projected/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-kube-api-access-7h4dl\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023280 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023340 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023399 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.023455 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96898b41-f590-489d-a5c4-e929344f2a14-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.025544 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96898b41-f590-489d-a5c4-e929344f2a14-kube-api-access-bhtmd" (OuterVolumeSpecName: "kube-api-access-bhtmd") pod "96898b41-f590-489d-a5c4-e929344f2a14" (UID: "96898b41-f590-489d-a5c4-e929344f2a14"). InnerVolumeSpecName "kube-api-access-bhtmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.052450 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-config-data" (OuterVolumeSpecName: "config-data") pod "96898b41-f590-489d-a5c4-e929344f2a14" (UID: "96898b41-f590-489d-a5c4-e929344f2a14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.057323 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96898b41-f590-489d-a5c4-e929344f2a14" (UID: "96898b41-f590-489d-a5c4-e929344f2a14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.076540 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "96898b41-f590-489d-a5c4-e929344f2a14" (UID: "96898b41-f590-489d-a5c4-e929344f2a14"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.096898 4948 generic.go:334] "Generic (PLEG): container finished" podID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerID="4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2" exitCode=143 Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.096953 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c","Type":"ContainerDied","Data":"4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2"} Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098332 4948 generic.go:334] "Generic (PLEG): container finished" podID="96898b41-f590-489d-a5c4-e929344f2a14" containerID="6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7" exitCode=0 Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098362 4948 generic.go:334] "Generic (PLEG): container finished" podID="96898b41-f590-489d-a5c4-e929344f2a14" containerID="b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639" exitCode=143 Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098428 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96898b41-f590-489d-a5c4-e929344f2a14","Type":"ContainerDied","Data":"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7"} Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098444 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"96898b41-f590-489d-a5c4-e929344f2a14","Type":"ContainerDied","Data":"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639"} Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098453 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96898b41-f590-489d-a5c4-e929344f2a14","Type":"ContainerDied","Data":"0626814c86fd795c8662afd214b5c471ca649aa3b4bcdcd8e52b281d4d754397"} Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098466 4948 scope.go:117] "RemoveContainer" containerID="6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.098574 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.105928 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.107225 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-r8vh6" event={"ID":"b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e","Type":"ContainerDied","Data":"c5603b44452cce4a343f9a17b128acc762e84138c0d01ea722208f869765ecc7"} Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.125796 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.126035 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.126166 4948 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-bhtmd\" (UniqueName: \"kubernetes.io/projected/96898b41-f590-489d-a5c4-e929344f2a14-kube-api-access-bhtmd\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.126251 4948 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96898b41-f590-489d-a5c4-e929344f2a14-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.152073 4948 scope.go:117] "RemoveContainer" containerID="b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.154436 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.163399 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.172769 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-r8vh6"] Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.190615 4948 scope.go:117] "RemoveContainer" containerID="6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.190727 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-r8vh6"] Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.191079 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7\": container with ID starting with 6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7 not found: ID does not exist" containerID="6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.191108 4948 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7"} err="failed to get container status \"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7\": rpc error: code = NotFound desc = could not find container \"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7\": container with ID starting with 6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7 not found: ID does not exist" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.191127 4948 scope.go:117] "RemoveContainer" containerID="b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639" Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.191418 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639\": container with ID starting with b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639 not found: ID does not exist" containerID="b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.191437 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639"} err="failed to get container status \"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639\": rpc error: code = NotFound desc = could not find container \"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639\": container with ID starting with b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639 not found: ID does not exist" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.191448 4948 scope.go:117] "RemoveContainer" containerID="6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.191650 4948 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7"} err="failed to get container status \"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7\": rpc error: code = NotFound desc = could not find container \"6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7\": container with ID starting with 6fbe78e840fee2c1a2ab92aef90524c238cefd3e80562cdb14a953a9df7649d7 not found: ID does not exist" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.191663 4948 scope.go:117] "RemoveContainer" containerID="b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.192102 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639"} err="failed to get container status \"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639\": rpc error: code = NotFound desc = could not find container \"b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639\": container with ID starting with b1b3981b7d354314ce056e86e616dda259f7894c5070056c52af2f326be08639 not found: ID does not exist" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.192121 4948 scope.go:117] "RemoveContainer" containerID="f2f5428c8fb22126200466715bb9c4edf354bd2a71a3ce55d28b33cc742b21d9" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.200439 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.200886 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-log" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.200903 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="96898b41-f590-489d-a5c4-e929344f2a14" 
containerName="nova-metadata-log" Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.200932 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerName="init" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.200940 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerName="init" Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.200959 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c608c956-a885-4f52-8f3c-24e9f5283cb3" containerName="nova-manage" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.200967 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c608c956-a885-4f52-8f3c-24e9f5283cb3" containerName="nova-manage" Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.200983 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerName="dnsmasq-dns" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.200991 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerName="dnsmasq-dns" Dec 04 17:59:04 crc kubenswrapper[4948]: E1204 17:59:04.201018 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-metadata" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.201026 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-metadata" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.201268 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-log" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.201283 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c608c956-a885-4f52-8f3c-24e9f5283cb3" 
containerName="nova-manage" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.201297 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" containerName="dnsmasq-dns" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.201317 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="96898b41-f590-489d-a5c4-e929344f2a14" containerName="nova-metadata-metadata" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.202265 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.204295 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.205385 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.212165 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.225392 4948 scope.go:117] "RemoveContainer" containerID="a420791b3826131acb476eb5c6cf776be12737be74e1462171b95730c8dcf98a" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.329383 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.329453 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4zn\" (UniqueName: \"kubernetes.io/projected/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-kube-api-access-kf4zn\") pod \"nova-metadata-0\" 
(UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.329566 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-config-data\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.329674 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-logs\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.329749 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.431803 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.431870 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 
17:59:04.431907 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4zn\" (UniqueName: \"kubernetes.io/projected/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-kube-api-access-kf4zn\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.431935 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-config-data\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.431991 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-logs\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.432339 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-logs\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.436995 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.437280 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.440387 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-config-data\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.458717 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4zn\" (UniqueName: \"kubernetes.io/projected/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-kube-api-access-kf4zn\") pod \"nova-metadata-0\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.525488 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.532602 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.636607 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-config-data\") pod \"ed713933-db04-4fdc-805d-7306d1cf2ec3\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.637270 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnz8\" (UniqueName: \"kubernetes.io/projected/ed713933-db04-4fdc-805d-7306d1cf2ec3-kube-api-access-rvnz8\") pod \"ed713933-db04-4fdc-805d-7306d1cf2ec3\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.637541 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-combined-ca-bundle\") pod \"ed713933-db04-4fdc-805d-7306d1cf2ec3\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.637693 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-scripts\") pod \"ed713933-db04-4fdc-805d-7306d1cf2ec3\" (UID: \"ed713933-db04-4fdc-805d-7306d1cf2ec3\") " Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.653205 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-scripts" (OuterVolumeSpecName: "scripts") pod "ed713933-db04-4fdc-805d-7306d1cf2ec3" (UID: "ed713933-db04-4fdc-805d-7306d1cf2ec3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.662290 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed713933-db04-4fdc-805d-7306d1cf2ec3-kube-api-access-rvnz8" (OuterVolumeSpecName: "kube-api-access-rvnz8") pod "ed713933-db04-4fdc-805d-7306d1cf2ec3" (UID: "ed713933-db04-4fdc-805d-7306d1cf2ec3"). InnerVolumeSpecName "kube-api-access-rvnz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.706433 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-config-data" (OuterVolumeSpecName: "config-data") pod "ed713933-db04-4fdc-805d-7306d1cf2ec3" (UID: "ed713933-db04-4fdc-805d-7306d1cf2ec3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.709449 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed713933-db04-4fdc-805d-7306d1cf2ec3" (UID: "ed713933-db04-4fdc-805d-7306d1cf2ec3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.740367 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.740401 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnz8\" (UniqueName: \"kubernetes.io/projected/ed713933-db04-4fdc-805d-7306d1cf2ec3-kube-api-access-rvnz8\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.740411 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.740419 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed713933-db04-4fdc-805d-7306d1cf2ec3-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.928106 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96898b41-f590-489d-a5c4-e929344f2a14" path="/var/lib/kubelet/pods/96898b41-f590-489d-a5c4-e929344f2a14/volumes" Dec 04 17:59:04 crc kubenswrapper[4948]: I1204 17:59:04.928728 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e" path="/var/lib/kubelet/pods/b8b989d7-73b9-4e45-b5fc-ddc77fa81e6e/volumes" Dec 04 17:59:05 crc kubenswrapper[4948]: W1204 17:59:05.079684 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f7e2c0_3aab_406b_9af6_f21c4088ff70.slice/crio-474676169dfa1849ffb69acb3ca33a7dbef8fcd56107267af6df5d604d44abf4 WatchSource:0}: Error finding container 
474676169dfa1849ffb69acb3ca33a7dbef8fcd56107267af6df5d604d44abf4: Status 404 returned error can't find the container with id 474676169dfa1849ffb69acb3ca33a7dbef8fcd56107267af6df5d604d44abf4 Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.080155 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.117401 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" event={"ID":"ed713933-db04-4fdc-805d-7306d1cf2ec3","Type":"ContainerDied","Data":"36e4d51ae69dafd368a7f4bc1bf3568b40e17ef46f606dd406ea91c9e271a8a1"} Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.117429 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-p2zk5" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.117439 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e4d51ae69dafd368a7f4bc1bf3568b40e17ef46f606dd406ea91c9e271a8a1" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.119030 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f7e2c0-3aab-406b-9af6-f21c4088ff70","Type":"ContainerStarted","Data":"474676169dfa1849ffb69acb3ca33a7dbef8fcd56107267af6df5d604d44abf4"} Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.122846 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" containerName="nova-scheduler-scheduler" containerID="cri-o://4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" gracePeriod=30 Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.190025 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 17:59:05 crc kubenswrapper[4948]: E1204 17:59:05.196686 4948 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ed713933-db04-4fdc-805d-7306d1cf2ec3" containerName="nova-cell1-conductor-db-sync" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.196727 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed713933-db04-4fdc-805d-7306d1cf2ec3" containerName="nova-cell1-conductor-db-sync" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.196960 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed713933-db04-4fdc-805d-7306d1cf2ec3" containerName="nova-cell1-conductor-db-sync" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.197738 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.201524 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.206621 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.250777 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlsqw\" (UniqueName: \"kubernetes.io/projected/e318bac5-87da-4a9b-9d73-8065c65f4b61-kube-api-access-nlsqw\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.250849 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.251208 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.353368 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.353544 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlsqw\" (UniqueName: \"kubernetes.io/projected/e318bac5-87da-4a9b-9d73-8065c65f4b61-kube-api-access-nlsqw\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.353641 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.360744 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.365283 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-combined-ca-bundle\") 
pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.376068 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlsqw\" (UniqueName: \"kubernetes.io/projected/e318bac5-87da-4a9b-9d73-8065c65f4b61-kube-api-access-nlsqw\") pod \"nova-cell1-conductor-0\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.529006 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:05 crc kubenswrapper[4948]: I1204 17:59:05.969366 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 17:59:05 crc kubenswrapper[4948]: W1204 17:59:05.977785 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode318bac5_87da_4a9b_9d73_8065c65f4b61.slice/crio-9e35f40a85bf47055c35b75f9d74b4fb908aaf754206e65e733571703469c199 WatchSource:0}: Error finding container 9e35f40a85bf47055c35b75f9d74b4fb908aaf754206e65e733571703469c199: Status 404 returned error can't find the container with id 9e35f40a85bf47055c35b75f9d74b4fb908aaf754206e65e733571703469c199 Dec 04 17:59:06 crc kubenswrapper[4948]: I1204 17:59:06.135004 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f7e2c0-3aab-406b-9af6-f21c4088ff70","Type":"ContainerStarted","Data":"6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408"} Dec 04 17:59:06 crc kubenswrapper[4948]: I1204 17:59:06.135113 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f7e2c0-3aab-406b-9af6-f21c4088ff70","Type":"ContainerStarted","Data":"3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011"} Dec 04 17:59:06 
crc kubenswrapper[4948]: I1204 17:59:06.136539 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e318bac5-87da-4a9b-9d73-8065c65f4b61","Type":"ContainerStarted","Data":"9e35f40a85bf47055c35b75f9d74b4fb908aaf754206e65e733571703469c199"} Dec 04 17:59:06 crc kubenswrapper[4948]: I1204 17:59:06.167121 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.167100432 podStartE2EDuration="2.167100432s" podCreationTimestamp="2025-12-04 17:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:06.156714978 +0000 UTC m=+1957.517789390" watchObservedRunningTime="2025-12-04 17:59:06.167100432 +0000 UTC m=+1957.528174834" Dec 04 17:59:07 crc kubenswrapper[4948]: I1204 17:59:07.151209 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e318bac5-87da-4a9b-9d73-8065c65f4b61","Type":"ContainerStarted","Data":"4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467"} Dec 04 17:59:07 crc kubenswrapper[4948]: I1204 17:59:07.151789 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:07 crc kubenswrapper[4948]: I1204 17:59:07.179636 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.179619657 podStartE2EDuration="2.179619657s" podCreationTimestamp="2025-12-04 17:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:07.170503254 +0000 UTC m=+1958.531577656" watchObservedRunningTime="2025-12-04 17:59:07.179619657 +0000 UTC m=+1958.540694059" Dec 04 17:59:07 crc kubenswrapper[4948]: E1204 17:59:07.603682 4948 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 17:59:07 crc kubenswrapper[4948]: E1204 17:59:07.606676 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 17:59:07 crc kubenswrapper[4948]: E1204 17:59:07.608773 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 17:59:07 crc kubenswrapper[4948]: E1204 17:59:07.608837 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" containerName="nova-scheduler-scheduler" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.742646 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.836227 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq8ws\" (UniqueName: \"kubernetes.io/projected/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-kube-api-access-vq8ws\") pod \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.836442 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-combined-ca-bundle\") pod \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.836496 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-config-data\") pod \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\" (UID: \"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa\") " Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.846627 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-kube-api-access-vq8ws" (OuterVolumeSpecName: "kube-api-access-vq8ws") pod "0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" (UID: "0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa"). InnerVolumeSpecName "kube-api-access-vq8ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.866224 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" (UID: "0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.874853 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-config-data" (OuterVolumeSpecName: "config-data") pod "0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" (UID: "0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.938309 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq8ws\" (UniqueName: \"kubernetes.io/projected/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-kube-api-access-vq8ws\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.938346 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:08 crc kubenswrapper[4948]: I1204 17:59:08.938355 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.175866 4948 generic.go:334] "Generic (PLEG): container finished" podID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" exitCode=0 Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.177502 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.177499 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa","Type":"ContainerDied","Data":"4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd"} Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.177785 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa","Type":"ContainerDied","Data":"24a56fc1b7247c29730590d1abcfe4c5db4b785831d0a5017b7dd53c2cc7cb28"} Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.177886 4948 scope.go:117] "RemoveContainer" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.214193 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.219291 4948 scope.go:117] "RemoveContainer" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" Dec 04 17:59:09 crc kubenswrapper[4948]: E1204 17:59:09.221310 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd\": container with ID starting with 4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd not found: ID does not exist" containerID="4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.221506 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd"} err="failed to get container status \"4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd\": rpc error: code = NotFound 
desc = could not find container \"4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd\": container with ID starting with 4ebf17a85e7a81467340a49876815714109cf9b7ad54f39a964e51ad5ff7c7dd not found: ID does not exist" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.228170 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.238399 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:09 crc kubenswrapper[4948]: E1204 17:59:09.238861 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" containerName="nova-scheduler-scheduler" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.238884 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" containerName="nova-scheduler-scheduler" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.239125 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" containerName="nova-scheduler-scheduler" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.239775 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.241430 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.249853 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.346708 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-config-data\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.346769 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlf7b\" (UniqueName: \"kubernetes.io/projected/5ef47a31-159f-42c4-a955-b1e833465dd9-kube-api-access-wlf7b\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.346861 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.448856 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-config-data\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.448934 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wlf7b\" (UniqueName: \"kubernetes.io/projected/5ef47a31-159f-42c4-a955-b1e833465dd9-kube-api-access-wlf7b\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.449010 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.453604 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-config-data\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.458192 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.502933 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlf7b\" (UniqueName: \"kubernetes.io/projected/5ef47a31-159f-42c4-a955-b1e833465dd9-kube-api-access-wlf7b\") pod \"nova-scheduler-0\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.525960 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.526500 4948 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 17:59:09 crc kubenswrapper[4948]: I1204 17:59:09.562362 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.018324 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:10 crc kubenswrapper[4948]: W1204 17:59:10.020446 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ef47a31_159f_42c4_a955_b1e833465dd9.slice/crio-7f3cd23df1554fcb3d36abfc756cd44622514399510938dcd440e19cfc2d9f94 WatchSource:0}: Error finding container 7f3cd23df1554fcb3d36abfc756cd44622514399510938dcd440e19cfc2d9f94: Status 404 returned error can't find the container with id 7f3cd23df1554fcb3d36abfc756cd44622514399510938dcd440e19cfc2d9f94 Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.056775 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.139743 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.165685 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-config-data\") pod \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.165815 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-combined-ca-bundle\") pod \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.166644 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-logs\") pod \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.166673 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnrxw\" (UniqueName: \"kubernetes.io/projected/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-kube-api-access-vnrxw\") pod \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\" (UID: \"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c\") " Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.168191 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-logs" (OuterVolumeSpecName: "logs") pod "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" (UID: "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.179349 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-kube-api-access-vnrxw" (OuterVolumeSpecName: "kube-api-access-vnrxw") pod "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" (UID: "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c"). InnerVolumeSpecName "kube-api-access-vnrxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.192690 4948 generic.go:334] "Generic (PLEG): container finished" podID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerID="a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a" exitCode=0 Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.192745 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.192748 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c","Type":"ContainerDied","Data":"a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a"} Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.193197 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"35f6eeeb-e7b3-4c57-900d-8e5f944cc25c","Type":"ContainerDied","Data":"05f065fd9e1b4bd73b40e0d1472bc3b6cf5ed6c65275ed0a21406da09ed7dd4e"} Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.193254 4948 scope.go:117] "RemoveContainer" containerID="a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.198582 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5ef47a31-159f-42c4-a955-b1e833465dd9","Type":"ContainerStarted","Data":"7f3cd23df1554fcb3d36abfc756cd44622514399510938dcd440e19cfc2d9f94"} Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.204649 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" (UID: "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.217577 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-config-data" (OuterVolumeSpecName: "config-data") pod "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" (UID: "35f6eeeb-e7b3-4c57-900d-8e5f944cc25c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.220575 4948 scope.go:117] "RemoveContainer" containerID="4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.240823 4948 scope.go:117] "RemoveContainer" containerID="a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a" Dec 04 17:59:10 crc kubenswrapper[4948]: E1204 17:59:10.241210 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a\": container with ID starting with a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a not found: ID does not exist" containerID="a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.241242 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a"} err="failed to get container status \"a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a\": rpc error: code = NotFound desc = could not find container \"a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a\": container with ID starting with a52d09e355c14dd6820b6b814e6ecdc680fe514b077ba244c5e6a6077611299a not found: ID does not exist" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.241262 4948 scope.go:117] "RemoveContainer" containerID="4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2" Dec 04 17:59:10 crc kubenswrapper[4948]: E1204 17:59:10.241641 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2\": container with ID starting with 4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2 not found: ID does not exist" containerID="4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.241662 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2"} err="failed to get container status \"4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2\": rpc error: code = NotFound desc = could not find container \"4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2\": container with ID starting with 4778849fbb546957037a2e99fe5a7df5a28f49875c3296b0128bbe53f0c862c2 not found: ID does not exist" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.269377 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:10 crc kubenswrapper[4948]: 
I1204 17:59:10.269734 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.269760 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.269771 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnrxw\" (UniqueName: \"kubernetes.io/projected/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c-kube-api-access-vnrxw\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.538378 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.550332 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.572597 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:10 crc kubenswrapper[4948]: E1204 17:59:10.573428 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-log" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.573535 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-log" Dec 04 17:59:10 crc kubenswrapper[4948]: E1204 17:59:10.573637 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-api" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.573718 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-api" Dec 04 17:59:10 crc 
kubenswrapper[4948]: I1204 17:59:10.574065 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-log" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.574184 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" containerName="nova-api-api" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.575627 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.578680 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.589219 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.677605 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7qm\" (UniqueName: \"kubernetes.io/projected/1b4c7195-ff07-4569-8768-c39e686596c9-kube-api-access-cq7qm\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.677912 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4c7195-ff07-4569-8768-c39e686596c9-logs\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.678153 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc 
kubenswrapper[4948]: I1204 17:59:10.678325 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-config-data\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.780772 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7qm\" (UniqueName: \"kubernetes.io/projected/1b4c7195-ff07-4569-8768-c39e686596c9-kube-api-access-cq7qm\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.781136 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4c7195-ff07-4569-8768-c39e686596c9-logs\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.781407 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.781651 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-config-data\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.782419 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4c7195-ff07-4569-8768-c39e686596c9-logs\") pod \"nova-api-0\" (UID: 
\"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.787302 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-config-data\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.787424 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.810128 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7qm\" (UniqueName: \"kubernetes.io/projected/1b4c7195-ff07-4569-8768-c39e686596c9-kube-api-access-cq7qm\") pod \"nova-api-0\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.892738 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.934110 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa" path="/var/lib/kubelet/pods/0c7cfcab-2a59-4a2b-8c1d-96f2c40beafa/volumes" Dec 04 17:59:10 crc kubenswrapper[4948]: I1204 17:59:10.934750 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f6eeeb-e7b3-4c57-900d-8e5f944cc25c" path="/var/lib/kubelet/pods/35f6eeeb-e7b3-4c57-900d-8e5f944cc25c/volumes" Dec 04 17:59:11 crc kubenswrapper[4948]: I1204 17:59:11.210489 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5ef47a31-159f-42c4-a955-b1e833465dd9","Type":"ContainerStarted","Data":"1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee"} Dec 04 17:59:11 crc kubenswrapper[4948]: I1204 17:59:11.232951 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.232924384 podStartE2EDuration="2.232924384s" podCreationTimestamp="2025-12-04 17:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:11.226544749 +0000 UTC m=+1962.587619221" watchObservedRunningTime="2025-12-04 17:59:11.232924384 +0000 UTC m=+1962.593998796" Dec 04 17:59:11 crc kubenswrapper[4948]: W1204 17:59:11.339316 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b4c7195_ff07_4569_8768_c39e686596c9.slice/crio-643e7e45caccc3699ff402453b8067b5306f076e42aee5be1cd1b724b9f26cbf WatchSource:0}: Error finding container 643e7e45caccc3699ff402453b8067b5306f076e42aee5be1cd1b724b9f26cbf: Status 404 returned error can't find the container with id 643e7e45caccc3699ff402453b8067b5306f076e42aee5be1cd1b724b9f26cbf Dec 04 17:59:11 crc kubenswrapper[4948]: 
I1204 17:59:11.357963 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:12 crc kubenswrapper[4948]: I1204 17:59:12.220459 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b4c7195-ff07-4569-8768-c39e686596c9","Type":"ContainerStarted","Data":"4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d"} Dec 04 17:59:12 crc kubenswrapper[4948]: I1204 17:59:12.221166 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b4c7195-ff07-4569-8768-c39e686596c9","Type":"ContainerStarted","Data":"86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad"} Dec 04 17:59:12 crc kubenswrapper[4948]: I1204 17:59:12.221195 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b4c7195-ff07-4569-8768-c39e686596c9","Type":"ContainerStarted","Data":"643e7e45caccc3699ff402453b8067b5306f076e42aee5be1cd1b724b9f26cbf"} Dec 04 17:59:12 crc kubenswrapper[4948]: I1204 17:59:12.241448 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.24142724 podStartE2EDuration="2.24142724s" podCreationTimestamp="2025-12-04 17:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:12.234981903 +0000 UTC m=+1963.596056305" watchObservedRunningTime="2025-12-04 17:59:12.24142724 +0000 UTC m=+1963.602501642" Dec 04 17:59:13 crc kubenswrapper[4948]: I1204 17:59:13.832112 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:59:13 crc kubenswrapper[4948]: I1204 17:59:13.832372 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8e901b75-2ae5-4a6a-b958-4c924edc4189" containerName="kube-state-metrics" 
containerID="cri-o://4ac93f5a77955edb05b13ac37f9560428bc365628cca00e1159d8b0b5f24352d" gracePeriod=30 Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.242300 4948 generic.go:334] "Generic (PLEG): container finished" podID="8e901b75-2ae5-4a6a-b958-4c924edc4189" containerID="4ac93f5a77955edb05b13ac37f9560428bc365628cca00e1159d8b0b5f24352d" exitCode=2 Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.242640 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e901b75-2ae5-4a6a-b958-4c924edc4189","Type":"ContainerDied","Data":"4ac93f5a77955edb05b13ac37f9560428bc365628cca00e1159d8b0b5f24352d"} Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.344939 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.448993 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whc4z\" (UniqueName: \"kubernetes.io/projected/8e901b75-2ae5-4a6a-b958-4c924edc4189-kube-api-access-whc4z\") pod \"8e901b75-2ae5-4a6a-b958-4c924edc4189\" (UID: \"8e901b75-2ae5-4a6a-b958-4c924edc4189\") " Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.454395 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e901b75-2ae5-4a6a-b958-4c924edc4189-kube-api-access-whc4z" (OuterVolumeSpecName: "kube-api-access-whc4z") pod "8e901b75-2ae5-4a6a-b958-4c924edc4189" (UID: "8e901b75-2ae5-4a6a-b958-4c924edc4189"). InnerVolumeSpecName "kube-api-access-whc4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.526267 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.526319 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.552429 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whc4z\" (UniqueName: \"kubernetes.io/projected/8e901b75-2ae5-4a6a-b958-4c924edc4189-kube-api-access-whc4z\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:14 crc kubenswrapper[4948]: I1204 17:59:14.562714 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.255199 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8e901b75-2ae5-4a6a-b958-4c924edc4189","Type":"ContainerDied","Data":"2c5f8b3adf7be13cc91b7d0a19176dd2cdc680846db75ed5aa265cf70a20cbc9"} Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.255279 4948 scope.go:117] "RemoveContainer" containerID="4ac93f5a77955edb05b13ac37f9560428bc365628cca00e1159d8b0b5f24352d" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.255475 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.319835 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.347487 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.358523 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:59:15 crc kubenswrapper[4948]: E1204 17:59:15.358893 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e901b75-2ae5-4a6a-b958-4c924edc4189" containerName="kube-state-metrics" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.358910 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e901b75-2ae5-4a6a-b958-4c924edc4189" containerName="kube-state-metrics" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.359111 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e901b75-2ae5-4a6a-b958-4c924edc4189" containerName="kube-state-metrics" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.359853 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.362305 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.363128 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.367719 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.469432 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.469561 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdnk\" (UniqueName: \"kubernetes.io/projected/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-api-access-szdnk\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.469634 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.469693 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.489464 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.490033 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-central-agent" containerID="cri-o://5f98cd6378510fbf07495b70d5dd6ac6dfb9412e84148157e5fea6706430acf5" gracePeriod=30 Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.490101 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="sg-core" containerID="cri-o://b0d8cedf17b3bdea647b1eaa2b97547299327f395dded493f35e180440142a2c" gracePeriod=30 Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.490183 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-notification-agent" containerID="cri-o://e12f22149b667115048940a93e4c08168a5eed7b4451bd6cc55a9ed2dac86031" gracePeriod=30 Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.490120 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="proxy-httpd" containerID="cri-o://0b0d4c57d69658e7e6ba20c11a9023d6e456f46d1183ca77fdd5aa0bb0a78b1a" gracePeriod=30 Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.538205 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.538205 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.561233 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.571231 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szdnk\" (UniqueName: \"kubernetes.io/projected/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-api-access-szdnk\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.571301 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.571341 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.571404 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.577220 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.577943 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.582440 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.599818 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdnk\" (UniqueName: \"kubernetes.io/projected/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-api-access-szdnk\") pod \"kube-state-metrics-0\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " pod="openstack/kube-state-metrics-0" Dec 04 17:59:15 crc kubenswrapper[4948]: I1204 17:59:15.679641 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.264972 4948 generic.go:334] "Generic (PLEG): container finished" podID="3fcef507-b266-492a-8877-f773828b5b0f" containerID="0b0d4c57d69658e7e6ba20c11a9023d6e456f46d1183ca77fdd5aa0bb0a78b1a" exitCode=0 Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.265360 4948 generic.go:334] "Generic (PLEG): container finished" podID="3fcef507-b266-492a-8877-f773828b5b0f" containerID="b0d8cedf17b3bdea647b1eaa2b97547299327f395dded493f35e180440142a2c" exitCode=2 Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.265373 4948 generic.go:334] "Generic (PLEG): container finished" podID="3fcef507-b266-492a-8877-f773828b5b0f" containerID="5f98cd6378510fbf07495b70d5dd6ac6dfb9412e84148157e5fea6706430acf5" exitCode=0 Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.265020 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerDied","Data":"0b0d4c57d69658e7e6ba20c11a9023d6e456f46d1183ca77fdd5aa0bb0a78b1a"} Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.265473 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerDied","Data":"b0d8cedf17b3bdea647b1eaa2b97547299327f395dded493f35e180440142a2c"} Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.265495 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerDied","Data":"5f98cd6378510fbf07495b70d5dd6ac6dfb9412e84148157e5fea6706430acf5"} Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.307824 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.309426 4948 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 04 17:59:16 crc kubenswrapper[4948]: I1204 17:59:16.926465 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e901b75-2ae5-4a6a-b958-4c924edc4189" path="/var/lib/kubelet/pods/8e901b75-2ae5-4a6a-b958-4c924edc4189/volumes" Dec 04 17:59:17 crc kubenswrapper[4948]: I1204 17:59:17.278126 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc","Type":"ContainerStarted","Data":"bf58486ad5fb8f9b6caf20551fac4a932d36fd2cc04ef2bf024f6aa264c91c35"} Dec 04 17:59:17 crc kubenswrapper[4948]: I1204 17:59:17.278836 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc","Type":"ContainerStarted","Data":"00936facc0bffe2a77c58d42db5363bccdb3d50e0aa8002ebd274826443e9d1a"} Dec 04 17:59:17 crc kubenswrapper[4948]: I1204 17:59:17.278999 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 17:59:17 crc kubenswrapper[4948]: I1204 17:59:17.295030 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8513664090000002 podStartE2EDuration="2.295004234s" podCreationTimestamp="2025-12-04 17:59:15 +0000 UTC" firstStartedPulling="2025-12-04 17:59:16.30873769 +0000 UTC m=+1967.669812092" lastFinishedPulling="2025-12-04 17:59:16.752375515 +0000 UTC m=+1968.113449917" observedRunningTime="2025-12-04 17:59:17.292208456 +0000 UTC m=+1968.653282858" watchObservedRunningTime="2025-12-04 17:59:17.295004234 +0000 UTC m=+1968.656078646" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.289705 4948 generic.go:334] "Generic (PLEG): container finished" podID="3fcef507-b266-492a-8877-f773828b5b0f" containerID="e12f22149b667115048940a93e4c08168a5eed7b4451bd6cc55a9ed2dac86031" exitCode=0 Dec 04 17:59:18 crc 
kubenswrapper[4948]: I1204 17:59:18.289944 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerDied","Data":"e12f22149b667115048940a93e4c08168a5eed7b4451bd6cc55a9ed2dac86031"} Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.623992 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824339 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-config-data\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824399 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-combined-ca-bundle\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824433 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-sg-core-conf-yaml\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824470 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-log-httpd\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824671 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-scripts\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824745 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-run-httpd\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.824770 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjc56\" (UniqueName: \"kubernetes.io/projected/3fcef507-b266-492a-8877-f773828b5b0f-kube-api-access-fjc56\") pod \"3fcef507-b266-492a-8877-f773828b5b0f\" (UID: \"3fcef507-b266-492a-8877-f773828b5b0f\") " Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.829848 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.830123 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.830242 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcef507-b266-492a-8877-f773828b5b0f-kube-api-access-fjc56" (OuterVolumeSpecName: "kube-api-access-fjc56") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). InnerVolumeSpecName "kube-api-access-fjc56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.830945 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-scripts" (OuterVolumeSpecName: "scripts") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.870528 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.904213 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.942218 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.942258 4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.942268 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.942277 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.942291 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fcef507-b266-492a-8877-f773828b5b0f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.942301 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjc56\" (UniqueName: \"kubernetes.io/projected/3fcef507-b266-492a-8877-f773828b5b0f-kube-api-access-fjc56\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:18 crc kubenswrapper[4948]: I1204 17:59:18.947620 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-config-data" (OuterVolumeSpecName: "config-data") pod "3fcef507-b266-492a-8877-f773828b5b0f" (UID: "3fcef507-b266-492a-8877-f773828b5b0f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.043884 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcef507-b266-492a-8877-f773828b5b0f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.300604 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.300589 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fcef507-b266-492a-8877-f773828b5b0f","Type":"ContainerDied","Data":"b3b7da2ace18f4c86a3c1719035e0e96e88afeb665af336abfa4ab80e3a51508"} Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.300691 4948 scope.go:117] "RemoveContainer" containerID="0b0d4c57d69658e7e6ba20c11a9023d6e456f46d1183ca77fdd5aa0bb0a78b1a" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.340264 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.340660 4948 scope.go:117] "RemoveContainer" containerID="b0d8cedf17b3bdea647b1eaa2b97547299327f395dded493f35e180440142a2c" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.349334 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.366207 4948 scope.go:117] "RemoveContainer" containerID="e12f22149b667115048940a93e4c08168a5eed7b4451bd6cc55a9ed2dac86031" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.368667 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:19 crc kubenswrapper[4948]: E1204 17:59:19.369138 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="sg-core" Dec 04 
17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369163 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="sg-core" Dec 04 17:59:19 crc kubenswrapper[4948]: E1204 17:59:19.369185 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="proxy-httpd" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369194 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="proxy-httpd" Dec 04 17:59:19 crc kubenswrapper[4948]: E1204 17:59:19.369210 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-notification-agent" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369220 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-notification-agent" Dec 04 17:59:19 crc kubenswrapper[4948]: E1204 17:59:19.369258 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-central-agent" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369267 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-central-agent" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369529 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="proxy-httpd" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369561 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="ceilometer-central-agent" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369590 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcef507-b266-492a-8877-f773828b5b0f" 
containerName="ceilometer-notification-agent" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.369604 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcef507-b266-492a-8877-f773828b5b0f" containerName="sg-core" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.371692 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.388893 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.393301 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.395828 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.396278 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.400212 4948 scope.go:117] "RemoveContainer" containerID="5f98cd6378510fbf07495b70d5dd6ac6dfb9412e84148157e5fea6706430acf5" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.553611 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.553820 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-scripts\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc 
kubenswrapper[4948]: I1204 17:59:19.553899 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-config-data\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.553966 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.554085 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7ng\" (UniqueName: \"kubernetes.io/projected/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-kube-api-access-gf7ng\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.554144 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-run-httpd\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.554200 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-log-httpd\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.554329 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.563506 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.606497 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.655945 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656033 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-scripts\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656081 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-config-data\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656113 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 
17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656161 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7ng\" (UniqueName: \"kubernetes.io/projected/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-kube-api-access-gf7ng\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656194 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-run-httpd\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656221 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-log-httpd\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656286 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.656834 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-log-httpd\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.657122 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-run-httpd\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.660002 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-scripts\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.660136 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.661944 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-config-data\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.662759 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.664233 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.674582 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7ng\" (UniqueName: \"kubernetes.io/projected/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-kube-api-access-gf7ng\") pod \"ceilometer-0\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " pod="openstack/ceilometer-0" Dec 04 17:59:19 crc kubenswrapper[4948]: I1204 17:59:19.689216 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:20 crc kubenswrapper[4948]: I1204 17:59:20.162095 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:20 crc kubenswrapper[4948]: W1204 17:59:20.169415 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f58d8e7_9c67_4717_bdb8_5efed102c7f9.slice/crio-dbb8b6bbb7fb637ef9348758fa00d0b14cd87bd96d36574ce8b2e349ee736a8f WatchSource:0}: Error finding container dbb8b6bbb7fb637ef9348758fa00d0b14cd87bd96d36574ce8b2e349ee736a8f: Status 404 returned error can't find the container with id dbb8b6bbb7fb637ef9348758fa00d0b14cd87bd96d36574ce8b2e349ee736a8f Dec 04 17:59:20 crc kubenswrapper[4948]: I1204 17:59:20.310164 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerStarted","Data":"dbb8b6bbb7fb637ef9348758fa00d0b14cd87bd96d36574ce8b2e349ee736a8f"} Dec 04 17:59:20 crc kubenswrapper[4948]: I1204 17:59:20.348622 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 17:59:20 crc kubenswrapper[4948]: I1204 17:59:20.893432 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 17:59:20 crc kubenswrapper[4948]: I1204 17:59:20.893784 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 17:59:20 crc kubenswrapper[4948]: I1204 
17:59:20.928354 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcef507-b266-492a-8877-f773828b5b0f" path="/var/lib/kubelet/pods/3fcef507-b266-492a-8877-f773828b5b0f/volumes" Dec 04 17:59:21 crc kubenswrapper[4948]: I1204 17:59:21.321598 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerStarted","Data":"28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa"} Dec 04 17:59:21 crc kubenswrapper[4948]: I1204 17:59:21.976330 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 17:59:21 crc kubenswrapper[4948]: I1204 17:59:21.976374 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 17:59:22 crc kubenswrapper[4948]: I1204 17:59:22.334301 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerStarted","Data":"f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a"} Dec 04 17:59:23 crc kubenswrapper[4948]: I1204 17:59:23.345136 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerStarted","Data":"c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734"} Dec 04 17:59:24 crc kubenswrapper[4948]: I1204 17:59:24.358148 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerStarted","Data":"506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce"} Dec 04 17:59:24 crc kubenswrapper[4948]: I1204 17:59:24.358514 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:59:24 crc kubenswrapper[4948]: I1204 17:59:24.389501 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5651473839999999 podStartE2EDuration="5.389479035s" podCreationTimestamp="2025-12-04 17:59:19 +0000 UTC" firstStartedPulling="2025-12-04 17:59:20.171119269 +0000 UTC m=+1971.532193671" lastFinishedPulling="2025-12-04 17:59:23.99545088 +0000 UTC m=+1975.356525322" observedRunningTime="2025-12-04 17:59:24.381222343 +0000 UTC m=+1975.742296765" watchObservedRunningTime="2025-12-04 17:59:24.389479035 +0000 UTC m=+1975.750553427" Dec 04 17:59:24 crc kubenswrapper[4948]: I1204 17:59:24.530810 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 17:59:24 crc kubenswrapper[4948]: I1204 17:59:24.532632 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 17:59:24 crc kubenswrapper[4948]: I1204 17:59:24.537549 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 17:59:25 crc kubenswrapper[4948]: I1204 17:59:25.376614 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 17:59:25 crc kubenswrapper[4948]: I1204 17:59:25.694728 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.388439 4948 generic.go:334] "Generic (PLEG): container finished" podID="36484b96-9927-4fae-b2b1-95c5bf766b21" 
containerID="9ae407ba7abe25be3566329a0b4d24921bd780cb2e8daa33169ca790f719ed14" exitCode=137 Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.389155 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"36484b96-9927-4fae-b2b1-95c5bf766b21","Type":"ContainerDied","Data":"9ae407ba7abe25be3566329a0b4d24921bd780cb2e8daa33169ca790f719ed14"} Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.548524 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.624059 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-config-data\") pod \"36484b96-9927-4fae-b2b1-95c5bf766b21\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.624214 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kmz5\" (UniqueName: \"kubernetes.io/projected/36484b96-9927-4fae-b2b1-95c5bf766b21-kube-api-access-4kmz5\") pod \"36484b96-9927-4fae-b2b1-95c5bf766b21\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.624285 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-combined-ca-bundle\") pod \"36484b96-9927-4fae-b2b1-95c5bf766b21\" (UID: \"36484b96-9927-4fae-b2b1-95c5bf766b21\") " Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.629589 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36484b96-9927-4fae-b2b1-95c5bf766b21-kube-api-access-4kmz5" (OuterVolumeSpecName: "kube-api-access-4kmz5") pod "36484b96-9927-4fae-b2b1-95c5bf766b21" (UID: 
"36484b96-9927-4fae-b2b1-95c5bf766b21"). InnerVolumeSpecName "kube-api-access-4kmz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.649647 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36484b96-9927-4fae-b2b1-95c5bf766b21" (UID: "36484b96-9927-4fae-b2b1-95c5bf766b21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.657604 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-config-data" (OuterVolumeSpecName: "config-data") pod "36484b96-9927-4fae-b2b1-95c5bf766b21" (UID: "36484b96-9927-4fae-b2b1-95c5bf766b21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.726448 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.726488 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kmz5\" (UniqueName: \"kubernetes.io/projected/36484b96-9927-4fae-b2b1-95c5bf766b21-kube-api-access-4kmz5\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:27 crc kubenswrapper[4948]: I1204 17:59:27.726499 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36484b96-9927-4fae-b2b1-95c5bf766b21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.399566 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"36484b96-9927-4fae-b2b1-95c5bf766b21","Type":"ContainerDied","Data":"cb67c6fac30128127e161596ee8dcfcd8fe5a9dd5beb1f2c2cf028b0e6bfb4dd"} Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.399897 4948 scope.go:117] "RemoveContainer" containerID="9ae407ba7abe25be3566329a0b4d24921bd780cb2e8daa33169ca790f719ed14" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.400097 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.470345 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.478108 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.494992 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:59:28 crc kubenswrapper[4948]: E1204 17:59:28.495514 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36484b96-9927-4fae-b2b1-95c5bf766b21" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.495550 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="36484b96-9927-4fae-b2b1-95c5bf766b21" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.495833 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="36484b96-9927-4fae-b2b1-95c5bf766b21" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.496618 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.499561 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.501834 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.502128 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.502683 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.541758 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpgg9\" (UniqueName: \"kubernetes.io/projected/6458efcd-4f47-46a1-92ab-3f1c77035cce-kube-api-access-hpgg9\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.541825 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.541893 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc 
kubenswrapper[4948]: I1204 17:59:28.541944 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.541966 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.642958 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpgg9\" (UniqueName: \"kubernetes.io/projected/6458efcd-4f47-46a1-92ab-3f1c77035cce-kube-api-access-hpgg9\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.643009 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.643060 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 
17:59:28.643104 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.643120 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.647650 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.648529 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.649638 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.653316 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.673434 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpgg9\" (UniqueName: \"kubernetes.io/projected/6458efcd-4f47-46a1-92ab-3f1c77035cce-kube-api-access-hpgg9\") pod \"nova-cell1-novncproxy-0\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.819762 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:28 crc kubenswrapper[4948]: I1204 17:59:28.937028 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36484b96-9927-4fae-b2b1-95c5bf766b21" path="/var/lib/kubelet/pods/36484b96-9927-4fae-b2b1-95c5bf766b21/volumes" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.291612 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 17:59:29 crc kubenswrapper[4948]: W1204 17:59:29.291861 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6458efcd_4f47_46a1_92ab_3f1c77035cce.slice/crio-61521069d25143e3b65a5c27570a323145f5f615e05b0c63d50d32acb4997549 WatchSource:0}: Error finding container 61521069d25143e3b65a5c27570a323145f5f615e05b0c63d50d32acb4997549: Status 404 returned error can't find the container with id 61521069d25143e3b65a5c27570a323145f5f615e05b0c63d50d32acb4997549 Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.409533 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6458efcd-4f47-46a1-92ab-3f1c77035cce","Type":"ContainerStarted","Data":"61521069d25143e3b65a5c27570a323145f5f615e05b0c63d50d32acb4997549"} Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.727405 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.731007 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.752927 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.804322 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x429r\" (UniqueName: \"kubernetes.io/projected/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-kube-api-access-x429r\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.804473 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-catalog-content\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.804581 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-utilities\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.906962 
4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x429r\" (UniqueName: \"kubernetes.io/projected/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-kube-api-access-x429r\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.907031 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-catalog-content\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.907084 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-utilities\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.907609 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-utilities\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.907751 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-catalog-content\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:29 crc kubenswrapper[4948]: I1204 17:59:29.926071 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x429r\" (UniqueName: \"kubernetes.io/projected/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-kube-api-access-x429r\") pod \"certified-operators-vv4h6\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.051416 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.433005 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6458efcd-4f47-46a1-92ab-3f1c77035cce","Type":"ContainerStarted","Data":"81a88ee925738bec54ae478b0412037a10331c49f88928f7e3a4b1b1fb31441f"} Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.455792 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.455770678 podStartE2EDuration="2.455770678s" podCreationTimestamp="2025-12-04 17:59:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:30.44953204 +0000 UTC m=+1981.810606442" watchObservedRunningTime="2025-12-04 17:59:30.455770678 +0000 UTC m=+1981.816845080" Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.596447 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.899125 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.899719 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.899765 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-api-0" Dec 04 17:59:30 crc kubenswrapper[4948]: I1204 17:59:30.901994 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.445439 4948 generic.go:334] "Generic (PLEG): container finished" podID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerID="cf66e9bcd5504a9e28d38570957b397f95bdc1f64120bad681fe6e3dec324426" exitCode=0 Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.445550 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vv4h6" event={"ID":"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8","Type":"ContainerDied","Data":"cf66e9bcd5504a9e28d38570957b397f95bdc1f64120bad681fe6e3dec324426"} Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.445575 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vv4h6" event={"ID":"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8","Type":"ContainerStarted","Data":"ec5cde05c72fc4e863308c1b66d076069ba84b558a01078071eb2400afb15397"} Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.446337 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.450367 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.672842 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-hnjl5"] Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.674868 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.702940 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-hnjl5"] Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.741559 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-config\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.741614 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766q6\" (UniqueName: \"kubernetes.io/projected/3326569d-4475-4365-8d93-b2b1522b6f60-kube-api-access-766q6\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.741657 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.741912 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.741990 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.742121 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.844149 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.844219 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.844253 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-config\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.844281 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-766q6\" (UniqueName: \"kubernetes.io/projected/3326569d-4475-4365-8d93-b2b1522b6f60-kube-api-access-766q6\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.844315 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.844385 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.845208 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.846367 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.846432 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.846948 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.847383 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-config\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:31 crc kubenswrapper[4948]: I1204 17:59:31.863948 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766q6\" (UniqueName: \"kubernetes.io/projected/3326569d-4475-4365-8d93-b2b1522b6f60-kube-api-access-766q6\") pod \"dnsmasq-dns-cd5cbd7b9-hnjl5\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:32 crc kubenswrapper[4948]: I1204 17:59:31.999924 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:32 crc kubenswrapper[4948]: I1204 17:59:32.520155 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-hnjl5"] Dec 04 17:59:33 crc kubenswrapper[4948]: I1204 17:59:33.471120 4948 generic.go:334] "Generic (PLEG): container finished" podID="3326569d-4475-4365-8d93-b2b1522b6f60" containerID="2e82af8d0ee65dbf61f13fbc7e3f43e88954a4a8f887ae20e99e967abdfd0b62" exitCode=0 Dec 04 17:59:33 crc kubenswrapper[4948]: I1204 17:59:33.471224 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" event={"ID":"3326569d-4475-4365-8d93-b2b1522b6f60","Type":"ContainerDied","Data":"2e82af8d0ee65dbf61f13fbc7e3f43e88954a4a8f887ae20e99e967abdfd0b62"} Dec 04 17:59:33 crc kubenswrapper[4948]: I1204 17:59:33.471435 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" event={"ID":"3326569d-4475-4365-8d93-b2b1522b6f60","Type":"ContainerStarted","Data":"0906b04e9303e652de258fded09bd6b6ebff496fc77ca9796ec94bb6c580e83b"} Dec 04 17:59:33 crc kubenswrapper[4948]: I1204 17:59:33.820983 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.256661 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.280114 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.280387 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-central-agent" containerID="cri-o://28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa" gracePeriod=30 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.280466 
4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-notification-agent" containerID="cri-o://f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a" gracePeriod=30 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.280496 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="sg-core" containerID="cri-o://c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734" gracePeriod=30 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.280592 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="proxy-httpd" containerID="cri-o://506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce" gracePeriod=30 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.289287 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.196:3000/\": read tcp 10.217.0.2:40888->10.217.0.196:3000: read: connection reset by peer" Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.491281 4948 generic.go:334] "Generic (PLEG): container finished" podID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerID="c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734" exitCode=2 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.491372 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerDied","Data":"c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734"} Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.496346 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" event={"ID":"3326569d-4475-4365-8d93-b2b1522b6f60","Type":"ContainerStarted","Data":"c91a77902e7ba5e05adfd3330e1a213f391e817ac078428b5350fe1e14dbe94b"} Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.496486 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.496685 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-log" containerID="cri-o://86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad" gracePeriod=30 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.497019 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-api" containerID="cri-o://4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d" gracePeriod=30 Dec 04 17:59:34 crc kubenswrapper[4948]: I1204 17:59:34.524303 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" podStartSLOduration=3.524284549 podStartE2EDuration="3.524284549s" podCreationTimestamp="2025-12-04 17:59:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:34.517919587 +0000 UTC m=+1985.878994009" watchObservedRunningTime="2025-12-04 17:59:34.524284549 +0000 UTC m=+1985.885358951" Dec 04 17:59:35 crc kubenswrapper[4948]: I1204 17:59:35.524230 4948 generic.go:334] "Generic (PLEG): container finished" podID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerID="506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce" exitCode=0 Dec 04 17:59:35 crc kubenswrapper[4948]: I1204 17:59:35.524579 4948 generic.go:334] "Generic 
(PLEG): container finished" podID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerID="28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa" exitCode=0 Dec 04 17:59:35 crc kubenswrapper[4948]: I1204 17:59:35.524374 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerDied","Data":"506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce"} Dec 04 17:59:35 crc kubenswrapper[4948]: I1204 17:59:35.524627 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerDied","Data":"28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa"} Dec 04 17:59:35 crc kubenswrapper[4948]: I1204 17:59:35.527214 4948 generic.go:334] "Generic (PLEG): container finished" podID="1b4c7195-ff07-4569-8768-c39e686596c9" containerID="86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad" exitCode=143 Dec 04 17:59:35 crc kubenswrapper[4948]: I1204 17:59:35.527286 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b4c7195-ff07-4569-8768-c39e686596c9","Type":"ContainerDied","Data":"86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad"} Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.161975 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.175601 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282737 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-ceilometer-tls-certs\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282782 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-config-data\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282817 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf7ng\" (UniqueName: \"kubernetes.io/projected/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-kube-api-access-gf7ng\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282854 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-combined-ca-bundle\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282878 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-scripts\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282937 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1b4c7195-ff07-4569-8768-c39e686596c9-logs\") pod \"1b4c7195-ff07-4569-8768-c39e686596c9\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.282982 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7qm\" (UniqueName: \"kubernetes.io/projected/1b4c7195-ff07-4569-8768-c39e686596c9-kube-api-access-cq7qm\") pod \"1b4c7195-ff07-4569-8768-c39e686596c9\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283001 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-config-data\") pod \"1b4c7195-ff07-4569-8768-c39e686596c9\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283092 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-sg-core-conf-yaml\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283213 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-log-httpd\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283251 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-combined-ca-bundle\") pod \"1b4c7195-ff07-4569-8768-c39e686596c9\" (UID: \"1b4c7195-ff07-4569-8768-c39e686596c9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 
17:59:38.283278 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-run-httpd\") pod \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\" (UID: \"3f58d8e7-9c67-4717-bdb8-5efed102c7f9\") " Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283516 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b4c7195-ff07-4569-8768-c39e686596c9-logs" (OuterVolumeSpecName: "logs") pod "1b4c7195-ff07-4569-8768-c39e686596c9" (UID: "1b4c7195-ff07-4569-8768-c39e686596c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283826 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.283985 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.284088 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.284110 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b4c7195-ff07-4569-8768-c39e686596c9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.289234 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-kube-api-access-gf7ng" (OuterVolumeSpecName: "kube-api-access-gf7ng") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "kube-api-access-gf7ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.293324 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4c7195-ff07-4569-8768-c39e686596c9-kube-api-access-cq7qm" (OuterVolumeSpecName: "kube-api-access-cq7qm") pod "1b4c7195-ff07-4569-8768-c39e686596c9" (UID: "1b4c7195-ff07-4569-8768-c39e686596c9"). InnerVolumeSpecName "kube-api-access-cq7qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.294692 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-scripts" (OuterVolumeSpecName: "scripts") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.316429 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.342128 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-config-data" (OuterVolumeSpecName: "config-data") pod "1b4c7195-ff07-4569-8768-c39e686596c9" (UID: "1b4c7195-ff07-4569-8768-c39e686596c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.362594 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b4c7195-ff07-4569-8768-c39e686596c9" (UID: "1b4c7195-ff07-4569-8768-c39e686596c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.364446 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385220 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385247 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385258 4948 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385266 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf7ng\" (UniqueName: \"kubernetes.io/projected/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-kube-api-access-gf7ng\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385274 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385282 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7qm\" (UniqueName: \"kubernetes.io/projected/1b4c7195-ff07-4569-8768-c39e686596c9-kube-api-access-cq7qm\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385291 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4c7195-ff07-4569-8768-c39e686596c9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.385300 
4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.397787 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.410404 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-config-data" (OuterVolumeSpecName: "config-data") pod "3f58d8e7-9c67-4717-bdb8-5efed102c7f9" (UID: "3f58d8e7-9c67-4717-bdb8-5efed102c7f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.486956 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.486995 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f58d8e7-9c67-4717-bdb8-5efed102c7f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.575787 4948 generic.go:334] "Generic (PLEG): container finished" podID="1b4c7195-ff07-4569-8768-c39e686596c9" containerID="4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d" exitCode=0 Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.575848 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b4c7195-ff07-4569-8768-c39e686596c9","Type":"ContainerDied","Data":"4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d"} Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.575874 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1b4c7195-ff07-4569-8768-c39e686596c9","Type":"ContainerDied","Data":"643e7e45caccc3699ff402453b8067b5306f076e42aee5be1cd1b724b9f26cbf"} Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.575890 4948 scope.go:117] "RemoveContainer" containerID="4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.576008 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.581913 4948 generic.go:334] "Generic (PLEG): container finished" podID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerID="f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a" exitCode=0 Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.582012 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerDied","Data":"f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a"} Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.582072 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3f58d8e7-9c67-4717-bdb8-5efed102c7f9","Type":"ContainerDied","Data":"dbb8b6bbb7fb637ef9348758fa00d0b14cd87bd96d36574ce8b2e349ee736a8f"} Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.582159 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.590267 4948 generic.go:334] "Generic (PLEG): container finished" podID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerID="94c6a4fa8ad8e6c07e199e78f5b35e01ea920cd56249614d724c43d28cca129b" exitCode=0 Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.590304 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vv4h6" event={"ID":"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8","Type":"ContainerDied","Data":"94c6a4fa8ad8e6c07e199e78f5b35e01ea920cd56249614d724c43d28cca129b"} Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.620356 4948 scope.go:117] "RemoveContainer" containerID="86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.652091 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.652924 4948 scope.go:117] "RemoveContainer" containerID="4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.656122 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d\": container with ID starting with 4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d not found: ID does not exist" containerID="4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.656193 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d"} err="failed to get container status \"4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d\": rpc error: code = NotFound desc = could not find container 
\"4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d\": container with ID starting with 4d95b21da4e83258da20cc4b535d887c609bffbc5b129e3d41dd4faf92d73f5d not found: ID does not exist" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.656445 4948 scope.go:117] "RemoveContainer" containerID="86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.660306 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad\": container with ID starting with 86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad not found: ID does not exist" containerID="86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.660338 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad"} err="failed to get container status \"86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad\": rpc error: code = NotFound desc = could not find container \"86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad\": container with ID starting with 86ae0a7e0bfbf2e668cfb0da56b5142a1011ba8544e3babaf7e63b94b34180ad not found: ID does not exist" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.660358 4948 scope.go:117] "RemoveContainer" containerID="506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.669131 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.678829 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.686517 4948 scope.go:117] 
"RemoveContainer" containerID="c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.691735 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701251 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.701678 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="proxy-httpd" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701698 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="proxy-httpd" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.701718 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-api" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701724 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-api" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.701741 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-notification-agent" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701747 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-notification-agent" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.701759 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-central-agent" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701765 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" 
containerName="ceilometer-central-agent" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.701786 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="sg-core" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701794 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="sg-core" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.701804 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-log" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.701809 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-log" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.702004 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-notification-agent" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.702021 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="sg-core" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.702033 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-log" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.702061 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" containerName="nova-api-api" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.702071 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="ceilometer-central-agent" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.702080 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" containerName="proxy-httpd" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.703052 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.704988 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.705101 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.707658 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.710335 4948 scope.go:117] "RemoveContainer" containerID="f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.712702 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.717970 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.720072 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.723883 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.724264 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.727598 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.736189 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.754236 4948 scope.go:117] "RemoveContainer" containerID="28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795103 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbpv8\" (UniqueName: \"kubernetes.io/projected/a3d57c98-ca9b-4167-984e-8092dc0957c6-kube-api-access-pbpv8\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795227 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795279 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-config-data\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795318 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d57c98-ca9b-4167-984e-8092dc0957c6-logs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795342 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqgqt\" (UniqueName: \"kubernetes.io/projected/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-kube-api-access-nqgqt\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795363 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795391 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-log-httpd\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795420 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") 
" pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795536 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795560 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-scripts\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795582 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795627 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-config-data\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795817 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.795882 4948 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-run-httpd\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.802437 4948 scope.go:117] "RemoveContainer" containerID="506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.802877 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce\": container with ID starting with 506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce not found: ID does not exist" containerID="506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.802920 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce"} err="failed to get container status \"506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce\": rpc error: code = NotFound desc = could not find container \"506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce\": container with ID starting with 506f1d2adeda7f5f334dc4b6241cba96f39442b2a8bbd99af9bb2d98dd7e62ce not found: ID does not exist" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.802948 4948 scope.go:117] "RemoveContainer" containerID="c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.803233 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734\": container with ID starting with 
c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734 not found: ID does not exist" containerID="c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.803257 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734"} err="failed to get container status \"c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734\": rpc error: code = NotFound desc = could not find container \"c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734\": container with ID starting with c80807ba5cd3acb96de957503a0cf08ca8d1af6af2738d5a6a22290b45bc8734 not found: ID does not exist" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.803269 4948 scope.go:117] "RemoveContainer" containerID="f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.803524 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a\": container with ID starting with f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a not found: ID does not exist" containerID="f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.803543 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a"} err="failed to get container status \"f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a\": rpc error: code = NotFound desc = could not find container \"f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a\": container with ID starting with f34993be8b53b279839cfcb24b47a35ebe5ccabde8e9831c3534feea3efe195a not found: ID does not 
exist" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.803554 4948 scope.go:117] "RemoveContainer" containerID="28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa" Dec 04 17:59:38 crc kubenswrapper[4948]: E1204 17:59:38.803849 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa\": container with ID starting with 28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa not found: ID does not exist" containerID="28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.803884 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa"} err="failed to get container status \"28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa\": rpc error: code = NotFound desc = could not find container \"28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa\": container with ID starting with 28ccd02d868f867e31d087b386be54bd34263843c9bcdad3064707cdbf3987fa not found: ID does not exist" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.820562 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.844585 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.896867 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc 
kubenswrapper[4948]: I1204 17:59:38.897012 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-config-data\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897580 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d57c98-ca9b-4167-984e-8092dc0957c6-logs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897616 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqgqt\" (UniqueName: \"kubernetes.io/projected/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-kube-api-access-nqgqt\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897634 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897722 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-log-httpd\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897793 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897581 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d57c98-ca9b-4167-984e-8092dc0957c6-logs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897855 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897890 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-scripts\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.897929 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.898005 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-config-data\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.898137 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.898181 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-run-httpd\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.898285 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbpv8\" (UniqueName: \"kubernetes.io/projected/a3d57c98-ca9b-4167-984e-8092dc0957c6-kube-api-access-pbpv8\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.898640 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-log-httpd\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.899030 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-run-httpd\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.902180 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.902968 4948 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.902989 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.903757 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.903856 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-scripts\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.903900 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.904537 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " 
pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.904870 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-config-data\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.906196 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-config-data\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.916335 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbpv8\" (UniqueName: \"kubernetes.io/projected/a3d57c98-ca9b-4167-984e-8092dc0957c6-kube-api-access-pbpv8\") pod \"nova-api-0\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " pod="openstack/nova-api-0" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.923501 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b4c7195-ff07-4569-8768-c39e686596c9" path="/var/lib/kubelet/pods/1b4c7195-ff07-4569-8768-c39e686596c9/volumes" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.924538 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f58d8e7-9c67-4717-bdb8-5efed102c7f9" path="/var/lib/kubelet/pods/3f58d8e7-9c67-4717-bdb8-5efed102c7f9/volumes" Dec 04 17:59:38 crc kubenswrapper[4948]: I1204 17:59:38.925562 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqgqt\" (UniqueName: \"kubernetes.io/projected/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-kube-api-access-nqgqt\") pod \"ceilometer-0\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " pod="openstack/ceilometer-0" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.026816 4948 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.072652 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.559356 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.607595 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3d57c98-ca9b-4167-984e-8092dc0957c6","Type":"ContainerStarted","Data":"754bb11e3a9a400e0aa7db61e5b0c85c54023eb90ce15980782b363b5d22f040"} Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.624993 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vv4h6" event={"ID":"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8","Type":"ContainerStarted","Data":"6da28e26da57b5b687704f2a42b7f467ac3cf4393aaf72df6b7a545c282431e7"} Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.660378 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.702523 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vv4h6" podStartSLOduration=3.089807962 podStartE2EDuration="10.702483716s" podCreationTimestamp="2025-12-04 17:59:29 +0000 UTC" firstStartedPulling="2025-12-04 17:59:31.450529319 +0000 UTC m=+1982.811603711" lastFinishedPulling="2025-12-04 17:59:39.063205063 +0000 UTC m=+1990.424279465" observedRunningTime="2025-12-04 17:59:39.66099603 +0000 UTC m=+1991.022070442" watchObservedRunningTime="2025-12-04 17:59:39.702483716 +0000 UTC m=+1991.063558118" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.826122 4948 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-cml8d"] Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.827435 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.829862 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.830685 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.848009 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cml8d"] Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.931858 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqh9\" (UniqueName: \"kubernetes.io/projected/195f5ec9-3622-48de-931e-9205f34910b0-kube-api-access-dpqh9\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.932435 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-scripts\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:39 crc kubenswrapper[4948]: I1204 17:59:39.932642 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-config-data\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:39 crc 
kubenswrapper[4948]: I1204 17:59:39.932693 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.033990 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqh9\" (UniqueName: \"kubernetes.io/projected/195f5ec9-3622-48de-931e-9205f34910b0-kube-api-access-dpqh9\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.034097 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-scripts\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.034199 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-config-data\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.034241 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 
17:59:40.039688 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.040798 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-scripts\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.042867 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-config-data\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.053823 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.053886 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.066965 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqh9\" (UniqueName: \"kubernetes.io/projected/195f5ec9-3622-48de-931e-9205f34910b0-kube-api-access-dpqh9\") pod \"nova-cell1-cell-mapping-cml8d\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.148639 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.347670 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 17:59:40 crc kubenswrapper[4948]: W1204 17:59:40.347827 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82b7f08d_e8bb_4fd7_a3b5_2fa0c94a0d8a.slice/crio-b8bd78e1b1d8408886b743e438e9e50bf8297293c1ac257c51005e12ba3eeccb WatchSource:0}: Error finding container b8bd78e1b1d8408886b743e438e9e50bf8297293c1ac257c51005e12ba3eeccb: Status 404 returned error can't find the container with id b8bd78e1b1d8408886b743e438e9e50bf8297293c1ac257c51005e12ba3eeccb Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.639016 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerStarted","Data":"b8bd78e1b1d8408886b743e438e9e50bf8297293c1ac257c51005e12ba3eeccb"} Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.642217 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3d57c98-ca9b-4167-984e-8092dc0957c6","Type":"ContainerStarted","Data":"2103de98fe18797b0763626316d2a27642fa7c63d098cfac2df6897009bcb31a"} Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.642244 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3d57c98-ca9b-4167-984e-8092dc0957c6","Type":"ContainerStarted","Data":"83b124e001400942829b8cbe8e8d03fc999f59bce7081fe5d6e7736970cd6605"} Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.701320 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.701292422 podStartE2EDuration="2.701292422s" podCreationTimestamp="2025-12-04 17:59:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:40.66275688 +0000 UTC m=+1992.023831282" watchObservedRunningTime="2025-12-04 17:59:40.701292422 +0000 UTC m=+1992.062366824" Dec 04 17:59:40 crc kubenswrapper[4948]: I1204 17:59:40.709321 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-cml8d"] Dec 04 17:59:40 crc kubenswrapper[4948]: W1204 17:59:40.712131 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod195f5ec9_3622_48de_931e_9205f34910b0.slice/crio-91a317d161e842c73d6416db22f4140cba9e2a0b586bd207e2fcfab08cf24ae9 WatchSource:0}: Error finding container 91a317d161e842c73d6416db22f4140cba9e2a0b586bd207e2fcfab08cf24ae9: Status 404 returned error can't find the container with id 91a317d161e842c73d6416db22f4140cba9e2a0b586bd207e2fcfab08cf24ae9 Dec 04 17:59:41 crc kubenswrapper[4948]: I1204 17:59:41.131362 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vv4h6" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="registry-server" probeResult="failure" output=< Dec 04 17:59:41 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 17:59:41 crc kubenswrapper[4948]: > Dec 04 17:59:41 crc kubenswrapper[4948]: I1204 17:59:41.674817 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerStarted","Data":"62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b"} Dec 04 17:59:41 crc kubenswrapper[4948]: I1204 17:59:41.675621 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerStarted","Data":"cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a"} Dec 04 17:59:41 crc kubenswrapper[4948]: I1204 
17:59:41.681940 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cml8d" event={"ID":"195f5ec9-3622-48de-931e-9205f34910b0","Type":"ContainerStarted","Data":"a83c756f93cc2cc4a9dcdfb85cb2483978e21e0e29fa43c794ae8a92f922c6e4"} Dec 04 17:59:41 crc kubenswrapper[4948]: I1204 17:59:41.681979 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cml8d" event={"ID":"195f5ec9-3622-48de-931e-9205f34910b0","Type":"ContainerStarted","Data":"91a317d161e842c73d6416db22f4140cba9e2a0b586bd207e2fcfab08cf24ae9"} Dec 04 17:59:41 crc kubenswrapper[4948]: I1204 17:59:41.707323 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-cml8d" podStartSLOduration=2.707300274 podStartE2EDuration="2.707300274s" podCreationTimestamp="2025-12-04 17:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:41.697207675 +0000 UTC m=+1993.058282077" watchObservedRunningTime="2025-12-04 17:59:41.707300274 +0000 UTC m=+1993.068374676" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.002101 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.104901 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btrns"] Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.111711 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-btrns" podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerName="dnsmasq-dns" containerID="cri-o://5f19696cdf8f2b7ca40c61e974811c8958a4c4252d9798e3c962a7dc83f23ba1" gracePeriod=10 Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.689415 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerID="5f19696cdf8f2b7ca40c61e974811c8958a4c4252d9798e3c962a7dc83f23ba1" exitCode=0 Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.689598 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btrns" event={"ID":"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55","Type":"ContainerDied","Data":"5f19696cdf8f2b7ca40c61e974811c8958a4c4252d9798e3c962a7dc83f23ba1"} Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.689716 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-btrns" event={"ID":"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55","Type":"ContainerDied","Data":"31483a4ebc909425e269f84686e6c88a28483fccbee64b2bb77dbf9f9d69bd95"} Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.689728 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31483a4ebc909425e269f84686e6c88a28483fccbee64b2bb77dbf9f9d69bd95" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.690527 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.795004 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-sb\") pod \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.795575 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k64s\" (UniqueName: \"kubernetes.io/projected/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-kube-api-access-5k64s\") pod \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.795629 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-nb\") pod \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.795675 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-svc\") pod \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.795711 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-config\") pod \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.795775 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-swift-storage-0\") pod \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\" (UID: \"69bffb9a-7476-4f8b-a3ab-7e1bce0cba55\") " Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.804237 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-kube-api-access-5k64s" (OuterVolumeSpecName: "kube-api-access-5k64s") pod "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" (UID: "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55"). InnerVolumeSpecName "kube-api-access-5k64s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.870627 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" (UID: "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.875106 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-config" (OuterVolumeSpecName: "config") pod "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" (UID: "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.888431 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" (UID: "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.890721 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" (UID: "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.897918 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.897951 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k64s\" (UniqueName: \"kubernetes.io/projected/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-kube-api-access-5k64s\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.897965 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.897973 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-config\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.897982 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:42 crc kubenswrapper[4948]: I1204 17:59:42.921007 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" (UID: "69bffb9a-7476-4f8b-a3ab-7e1bce0cba55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:59:43 crc kubenswrapper[4948]: I1204 17:59:43.000445 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:43 crc kubenswrapper[4948]: I1204 17:59:43.700651 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-btrns" Dec 04 17:59:43 crc kubenswrapper[4948]: I1204 17:59:43.700641 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerStarted","Data":"a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f"} Dec 04 17:59:43 crc kubenswrapper[4948]: I1204 17:59:43.727138 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btrns"] Dec 04 17:59:43 crc kubenswrapper[4948]: I1204 17:59:43.736548 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-btrns"] Dec 04 17:59:44 crc kubenswrapper[4948]: I1204 17:59:44.712032 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerStarted","Data":"f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54"} Dec 04 17:59:44 crc kubenswrapper[4948]: I1204 17:59:44.712524 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 17:59:44 crc kubenswrapper[4948]: I1204 17:59:44.740561 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.6670621839999997 podStartE2EDuration="6.740539255s" podCreationTimestamp="2025-12-04 17:59:38 +0000 UTC" firstStartedPulling="2025-12-04 17:59:40.349586824 +0000 UTC m=+1991.710661226" lastFinishedPulling="2025-12-04 17:59:44.423063895 +0000 UTC m=+1995.784138297" observedRunningTime="2025-12-04 17:59:44.737544129 +0000 UTC m=+1996.098618531" watchObservedRunningTime="2025-12-04 17:59:44.740539255 +0000 UTC m=+1996.101613657" Dec 04 17:59:44 crc kubenswrapper[4948]: I1204 17:59:44.953174 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" path="/var/lib/kubelet/pods/69bffb9a-7476-4f8b-a3ab-7e1bce0cba55/volumes" Dec 04 17:59:45 crc kubenswrapper[4948]: I1204 17:59:45.723361 4948 generic.go:334] "Generic (PLEG): container finished" podID="195f5ec9-3622-48de-931e-9205f34910b0" containerID="a83c756f93cc2cc4a9dcdfb85cb2483978e21e0e29fa43c794ae8a92f922c6e4" exitCode=0 Dec 04 17:59:45 crc kubenswrapper[4948]: I1204 17:59:45.723450 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cml8d" event={"ID":"195f5ec9-3622-48de-931e-9205f34910b0","Type":"ContainerDied","Data":"a83c756f93cc2cc4a9dcdfb85cb2483978e21e0e29fa43c794ae8a92f922c6e4"} Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.082759 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.197335 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-scripts\") pod \"195f5ec9-3622-48de-931e-9205f34910b0\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.197676 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-combined-ca-bundle\") pod \"195f5ec9-3622-48de-931e-9205f34910b0\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.197754 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpqh9\" (UniqueName: \"kubernetes.io/projected/195f5ec9-3622-48de-931e-9205f34910b0-kube-api-access-dpqh9\") pod \"195f5ec9-3622-48de-931e-9205f34910b0\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.197802 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-config-data\") pod \"195f5ec9-3622-48de-931e-9205f34910b0\" (UID: \"195f5ec9-3622-48de-931e-9205f34910b0\") " Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.202650 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-scripts" (OuterVolumeSpecName: "scripts") pod "195f5ec9-3622-48de-931e-9205f34910b0" (UID: "195f5ec9-3622-48de-931e-9205f34910b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.203992 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195f5ec9-3622-48de-931e-9205f34910b0-kube-api-access-dpqh9" (OuterVolumeSpecName: "kube-api-access-dpqh9") pod "195f5ec9-3622-48de-931e-9205f34910b0" (UID: "195f5ec9-3622-48de-931e-9205f34910b0"). InnerVolumeSpecName "kube-api-access-dpqh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.228308 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "195f5ec9-3622-48de-931e-9205f34910b0" (UID: "195f5ec9-3622-48de-931e-9205f34910b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.230065 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-config-data" (OuterVolumeSpecName: "config-data") pod "195f5ec9-3622-48de-931e-9205f34910b0" (UID: "195f5ec9-3622-48de-931e-9205f34910b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.300447 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpqh9\" (UniqueName: \"kubernetes.io/projected/195f5ec9-3622-48de-931e-9205f34910b0-kube-api-access-dpqh9\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.300492 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.300501 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.300511 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/195f5ec9-3622-48de-931e-9205f34910b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.746520 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-cml8d" event={"ID":"195f5ec9-3622-48de-931e-9205f34910b0","Type":"ContainerDied","Data":"91a317d161e842c73d6416db22f4140cba9e2a0b586bd207e2fcfab08cf24ae9"} Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.746566 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a317d161e842c73d6416db22f4140cba9e2a0b586bd207e2fcfab08cf24ae9" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.746640 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-cml8d" Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.920252 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.920525 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-api" containerID="cri-o://2103de98fe18797b0763626316d2a27642fa7c63d098cfac2df6897009bcb31a" gracePeriod=30 Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.920835 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-log" containerID="cri-o://83b124e001400942829b8cbe8e8d03fc999f59bce7081fe5d6e7736970cd6605" gracePeriod=30 Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.964917 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.965180 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5ef47a31-159f-42c4-a955-b1e833465dd9" containerName="nova-scheduler-scheduler" containerID="cri-o://1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" gracePeriod=30 Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.991292 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.991583 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-log" containerID="cri-o://3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011" gracePeriod=30 Dec 04 17:59:47 crc kubenswrapper[4948]: I1204 17:59:47.991698 4948 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-metadata" containerID="cri-o://6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408" gracePeriod=30 Dec 04 17:59:48 crc kubenswrapper[4948]: I1204 17:59:48.766068 4948 generic.go:334] "Generic (PLEG): container finished" podID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerID="2103de98fe18797b0763626316d2a27642fa7c63d098cfac2df6897009bcb31a" exitCode=0 Dec 04 17:59:48 crc kubenswrapper[4948]: I1204 17:59:48.766378 4948 generic.go:334] "Generic (PLEG): container finished" podID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerID="83b124e001400942829b8cbe8e8d03fc999f59bce7081fe5d6e7736970cd6605" exitCode=143 Dec 04 17:59:48 crc kubenswrapper[4948]: I1204 17:59:48.766154 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3d57c98-ca9b-4167-984e-8092dc0957c6","Type":"ContainerDied","Data":"2103de98fe18797b0763626316d2a27642fa7c63d098cfac2df6897009bcb31a"} Dec 04 17:59:48 crc kubenswrapper[4948]: I1204 17:59:48.766455 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3d57c98-ca9b-4167-984e-8092dc0957c6","Type":"ContainerDied","Data":"83b124e001400942829b8cbe8e8d03fc999f59bce7081fe5d6e7736970cd6605"} Dec 04 17:59:48 crc kubenswrapper[4948]: I1204 17:59:48.768905 4948 generic.go:334] "Generic (PLEG): container finished" podID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerID="3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011" exitCode=143 Dec 04 17:59:48 crc kubenswrapper[4948]: I1204 17:59:48.768936 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f7e2c0-3aab-406b-9af6-f21c4088ff70","Type":"ContainerDied","Data":"3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011"} Dec 04 17:59:49 crc kubenswrapper[4948]: E1204 
17:59:49.566598 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 17:59:49 crc kubenswrapper[4948]: E1204 17:59:49.568289 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 17:59:49 crc kubenswrapper[4948]: E1204 17:59:49.572155 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 17:59:49 crc kubenswrapper[4948]: E1204 17:59:49.572308 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5ef47a31-159f-42c4-a955-b1e833465dd9" containerName="nova-scheduler-scheduler" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.124971 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.190682 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.269661 4948 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.361471 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvf9l"] Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.361716 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mvf9l" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="registry-server" containerID="cri-o://683f0c8dd6336e2972c4c9e6013948a40aaf02b4f7702c622d325d2d7f112807" gracePeriod=2 Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.378714 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.460740 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-public-tls-certs\") pod \"a3d57c98-ca9b-4167-984e-8092dc0957c6\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.460897 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbpv8\" (UniqueName: \"kubernetes.io/projected/a3d57c98-ca9b-4167-984e-8092dc0957c6-kube-api-access-pbpv8\") pod \"a3d57c98-ca9b-4167-984e-8092dc0957c6\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.460936 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d57c98-ca9b-4167-984e-8092dc0957c6-logs\") pod \"a3d57c98-ca9b-4167-984e-8092dc0957c6\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.460976 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-internal-tls-certs\") pod \"a3d57c98-ca9b-4167-984e-8092dc0957c6\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.461014 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-combined-ca-bundle\") pod \"a3d57c98-ca9b-4167-984e-8092dc0957c6\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.461086 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-config-data\") pod \"a3d57c98-ca9b-4167-984e-8092dc0957c6\" (UID: \"a3d57c98-ca9b-4167-984e-8092dc0957c6\") " Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.461706 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d57c98-ca9b-4167-984e-8092dc0957c6-logs" (OuterVolumeSpecName: "logs") pod "a3d57c98-ca9b-4167-984e-8092dc0957c6" (UID: "a3d57c98-ca9b-4167-984e-8092dc0957c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.470938 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d57c98-ca9b-4167-984e-8092dc0957c6-kube-api-access-pbpv8" (OuterVolumeSpecName: "kube-api-access-pbpv8") pod "a3d57c98-ca9b-4167-984e-8092dc0957c6" (UID: "a3d57c98-ca9b-4167-984e-8092dc0957c6"). InnerVolumeSpecName "kube-api-access-pbpv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.490795 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3d57c98-ca9b-4167-984e-8092dc0957c6" (UID: "a3d57c98-ca9b-4167-984e-8092dc0957c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.498975 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-config-data" (OuterVolumeSpecName: "config-data") pod "a3d57c98-ca9b-4167-984e-8092dc0957c6" (UID: "a3d57c98-ca9b-4167-984e-8092dc0957c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.510878 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a3d57c98-ca9b-4167-984e-8092dc0957c6" (UID: "a3d57c98-ca9b-4167-984e-8092dc0957c6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.514592 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a3d57c98-ca9b-4167-984e-8092dc0957c6" (UID: "a3d57c98-ca9b-4167-984e-8092dc0957c6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.563381 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbpv8\" (UniqueName: \"kubernetes.io/projected/a3d57c98-ca9b-4167-984e-8092dc0957c6-kube-api-access-pbpv8\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.563419 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d57c98-ca9b-4167-984e-8092dc0957c6-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.563434 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.563445 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.563458 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.563553 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d57c98-ca9b-4167-984e-8092dc0957c6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.792411 4948 generic.go:334] "Generic (PLEG): container finished" podID="336ada24-a6eb-405e-ac32-f04009852896" containerID="683f0c8dd6336e2972c4c9e6013948a40aaf02b4f7702c622d325d2d7f112807" exitCode=0 Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.792493 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerDied","Data":"683f0c8dd6336e2972c4c9e6013948a40aaf02b4f7702c622d325d2d7f112807"} Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.795569 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.796409 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3d57c98-ca9b-4167-984e-8092dc0957c6","Type":"ContainerDied","Data":"754bb11e3a9a400e0aa7db61e5b0c85c54023eb90ce15980782b363b5d22f040"} Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.796494 4948 scope.go:117] "RemoveContainer" containerID="2103de98fe18797b0763626316d2a27642fa7c63d098cfac2df6897009bcb31a" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.838439 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.844367 4948 scope.go:117] "RemoveContainer" containerID="83b124e001400942829b8cbe8e8d03fc999f59bce7081fe5d6e7736970cd6605" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.852507 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871240 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:50 crc kubenswrapper[4948]: E1204 17:59:50.871673 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerName="init" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871692 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerName="init" Dec 04 17:59:50 crc kubenswrapper[4948]: E1204 17:59:50.871719 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerName="dnsmasq-dns" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871728 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerName="dnsmasq-dns" Dec 04 17:59:50 crc kubenswrapper[4948]: E1204 17:59:50.871747 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="195f5ec9-3622-48de-931e-9205f34910b0" containerName="nova-manage" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871755 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="195f5ec9-3622-48de-931e-9205f34910b0" containerName="nova-manage" Dec 04 17:59:50 crc kubenswrapper[4948]: E1204 17:59:50.871776 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-log" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871782 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-log" Dec 04 17:59:50 crc kubenswrapper[4948]: E1204 17:59:50.871790 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-api" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871796 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-api" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.871984 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-api" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.872012 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" containerName="nova-api-log" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.872028 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="195f5ec9-3622-48de-931e-9205f34910b0" containerName="nova-manage" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.872058 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bffb9a-7476-4f8b-a3ab-7e1bce0cba55" containerName="dnsmasq-dns" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.873068 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.876380 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.876520 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.876661 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.888998 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.926501 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d57c98-ca9b-4167-984e-8092dc0957c6" path="/var/lib/kubelet/pods/a3d57c98-ca9b-4167-984e-8092dc0957c6/volumes" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.970189 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.970438 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brx8x\" (UniqueName: \"kubernetes.io/projected/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-kube-api-access-brx8x\") 
pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.970462 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.970501 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.970780 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-config-data\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:50 crc kubenswrapper[4948]: I1204 17:59:50.970969 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-logs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.072884 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.072927 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brx8x\" (UniqueName: \"kubernetes.io/projected/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-kube-api-access-brx8x\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.072955 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.072999 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.073084 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-config-data\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.073153 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-logs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.073658 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-logs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " 
pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.077561 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-config-data\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.079570 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.079953 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.080365 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-public-tls-certs\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.087608 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.092511 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brx8x\" (UniqueName: \"kubernetes.io/projected/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-kube-api-access-brx8x\") pod \"nova-api-0\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.126334 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:33442->10.217.0.191:8775: read: connection reset by peer" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.126382 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:33434->10.217.0.191:8775: read: connection reset by peer" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.174769 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stjmg\" (UniqueName: \"kubernetes.io/projected/336ada24-a6eb-405e-ac32-f04009852896-kube-api-access-stjmg\") pod \"336ada24-a6eb-405e-ac32-f04009852896\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.175144 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-utilities\") pod \"336ada24-a6eb-405e-ac32-f04009852896\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.175285 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-catalog-content\") pod \"336ada24-a6eb-405e-ac32-f04009852896\" (UID: \"336ada24-a6eb-405e-ac32-f04009852896\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.175863 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-utilities" (OuterVolumeSpecName: "utilities") pod "336ada24-a6eb-405e-ac32-f04009852896" (UID: "336ada24-a6eb-405e-ac32-f04009852896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.180075 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336ada24-a6eb-405e-ac32-f04009852896-kube-api-access-stjmg" (OuterVolumeSpecName: "kube-api-access-stjmg") pod "336ada24-a6eb-405e-ac32-f04009852896" (UID: "336ada24-a6eb-405e-ac32-f04009852896"). InnerVolumeSpecName "kube-api-access-stjmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.200553 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.222287 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "336ada24-a6eb-405e-ac32-f04009852896" (UID: "336ada24-a6eb-405e-ac32-f04009852896"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.277363 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.277397 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stjmg\" (UniqueName: \"kubernetes.io/projected/336ada24-a6eb-405e-ac32-f04009852896-kube-api-access-stjmg\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.277409 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/336ada24-a6eb-405e-ac32-f04009852896-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.687686 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.797458 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf4zn\" (UniqueName: \"kubernetes.io/projected/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-kube-api-access-kf4zn\") pod \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.797534 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-logs\") pod \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.797671 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-config-data\") pod 
\"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.797722 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-nova-metadata-tls-certs\") pod \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.797753 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-combined-ca-bundle\") pod \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\" (UID: \"f1f7e2c0-3aab-406b-9af6-f21c4088ff70\") " Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.801901 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-logs" (OuterVolumeSpecName: "logs") pod "f1f7e2c0-3aab-406b-9af6-f21c4088ff70" (UID: "f1f7e2c0-3aab-406b-9af6-f21c4088ff70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.810470 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-kube-api-access-kf4zn" (OuterVolumeSpecName: "kube-api-access-kf4zn") pod "f1f7e2c0-3aab-406b-9af6-f21c4088ff70" (UID: "f1f7e2c0-3aab-406b-9af6-f21c4088ff70"). InnerVolumeSpecName "kube-api-access-kf4zn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.820778 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.827384 4948 generic.go:334] "Generic (PLEG): container finished" podID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerID="6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408" exitCode=0 Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.827510 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f7e2c0-3aab-406b-9af6-f21c4088ff70","Type":"ContainerDied","Data":"6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408"} Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.827599 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f1f7e2c0-3aab-406b-9af6-f21c4088ff70","Type":"ContainerDied","Data":"474676169dfa1849ffb69acb3ca33a7dbef8fcd56107267af6df5d604d44abf4"} Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.827701 4948 scope.go:117] "RemoveContainer" containerID="6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.827875 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.842222 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mvf9l" event={"ID":"336ada24-a6eb-405e-ac32-f04009852896","Type":"ContainerDied","Data":"3442db8d50667a3aa23acd0a1925e43754acfb2fad24f2144b9ef37d80de6140"} Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.843159 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mvf9l" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.871973 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f1f7e2c0-3aab-406b-9af6-f21c4088ff70" (UID: "f1f7e2c0-3aab-406b-9af6-f21c4088ff70"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.887890 4948 scope.go:117] "RemoveContainer" containerID="3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.888458 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-config-data" (OuterVolumeSpecName: "config-data") pod "f1f7e2c0-3aab-406b-9af6-f21c4088ff70" (UID: "f1f7e2c0-3aab-406b-9af6-f21c4088ff70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.898324 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1f7e2c0-3aab-406b-9af6-f21c4088ff70" (UID: "f1f7e2c0-3aab-406b-9af6-f21c4088ff70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.899649 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf4zn\" (UniqueName: \"kubernetes.io/projected/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-kube-api-access-kf4zn\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.899684 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-logs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.899696 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.899707 4948 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.899718 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f7e2c0-3aab-406b-9af6-f21c4088ff70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.910369 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mvf9l"] Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.917704 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mvf9l"] Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.927287 4948 scope.go:117] "RemoveContainer" containerID="6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408" Dec 04 17:59:51 crc kubenswrapper[4948]: E1204 17:59:51.927643 4948 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408\": container with ID starting with 6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408 not found: ID does not exist" containerID="6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.927730 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408"} err="failed to get container status \"6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408\": rpc error: code = NotFound desc = could not find container \"6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408\": container with ID starting with 6046dfc00102b26b06fbf34dd51f02f144629dd0aab5d21b5d6156f3eccfa408 not found: ID does not exist" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.927829 4948 scope.go:117] "RemoveContainer" containerID="3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011" Dec 04 17:59:51 crc kubenswrapper[4948]: E1204 17:59:51.928272 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011\": container with ID starting with 3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011 not found: ID does not exist" containerID="3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.928352 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011"} err="failed to get container status \"3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011\": rpc error: code = NotFound 
desc = could not find container \"3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011\": container with ID starting with 3d12b511fe88da334f88380887acc9bce9e5eda0c4fb1d174c21d21d9f66d011 not found: ID does not exist" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.928416 4948 scope.go:117] "RemoveContainer" containerID="683f0c8dd6336e2972c4c9e6013948a40aaf02b4f7702c622d325d2d7f112807" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.963238 4948 scope.go:117] "RemoveContainer" containerID="1c4845ecaa986618e879bf5def3f055aed9f843006b36b7e6703ef5863133124" Dec 04 17:59:51 crc kubenswrapper[4948]: I1204 17:59:51.998913 4948 scope.go:117] "RemoveContainer" containerID="82bd6d5966c0c1081c9e1c0191f9efbb579e64ad9b3814df1220900b64c1b34a" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.168426 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.182606 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208121 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:52 crc kubenswrapper[4948]: E1204 17:59:52.208574 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="extract-content" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208595 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="extract-content" Dec 04 17:59:52 crc kubenswrapper[4948]: E1204 17:59:52.208616 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-log" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208624 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" 
containerName="nova-metadata-log" Dec 04 17:59:52 crc kubenswrapper[4948]: E1204 17:59:52.208637 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-metadata" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208645 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-metadata" Dec 04 17:59:52 crc kubenswrapper[4948]: E1204 17:59:52.208657 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="registry-server" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208664 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="registry-server" Dec 04 17:59:52 crc kubenswrapper[4948]: E1204 17:59:52.208677 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="extract-utilities" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208685 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="extract-utilities" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208911 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-log" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208929 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="336ada24-a6eb-405e-ac32-f04009852896" containerName="registry-server" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.208949 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" containerName="nova-metadata-metadata" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.210152 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.214122 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.214321 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.248068 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.305582 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b408db-1dec-49e0-8212-1193d4fe6a37-logs\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.305700 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.305842 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbnc\" (UniqueName: \"kubernetes.io/projected/60b408db-1dec-49e0-8212-1193d4fe6a37-kube-api-access-flbnc\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.305967 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-config-data\") pod \"nova-metadata-0\" (UID: 
\"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.306029 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.408305 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-config-data\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.408421 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.408509 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b408db-1dec-49e0-8212-1193d4fe6a37-logs\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.408581 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.408624 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbnc\" (UniqueName: \"kubernetes.io/projected/60b408db-1dec-49e0-8212-1193d4fe6a37-kube-api-access-flbnc\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.408938 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b408db-1dec-49e0-8212-1193d4fe6a37-logs\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.413596 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.415616 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-config-data\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.417532 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.430730 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbnc\" (UniqueName: \"kubernetes.io/projected/60b408db-1dec-49e0-8212-1193d4fe6a37-kube-api-access-flbnc\") pod 
\"nova-metadata-0\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.613252 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.854804 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3","Type":"ContainerStarted","Data":"12e734767396eb518b40a349afac5356ca256b1870d63b32a4acf4a594db67b0"} Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.855184 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3","Type":"ContainerStarted","Data":"7ac6688f04690776901f6c203c84667f87067c95bbc8e52dbe5b9d9106e8a071"} Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.855201 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3","Type":"ContainerStarted","Data":"465e42b51728405a405c62806bb707352238117837983114b872ee8862b54dfe"} Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.897029 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.897005561 podStartE2EDuration="2.897005561s" podCreationTimestamp="2025-12-04 17:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:52.882028373 +0000 UTC m=+2004.243102775" watchObservedRunningTime="2025-12-04 17:59:52.897005561 +0000 UTC m=+2004.258079963" Dec 04 17:59:52 crc kubenswrapper[4948]: I1204 17:59:52.929504 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336ada24-a6eb-405e-ac32-f04009852896" path="/var/lib/kubelet/pods/336ada24-a6eb-405e-ac32-f04009852896/volumes" Dec 04 17:59:52 crc 
kubenswrapper[4948]: I1204 17:59:52.930329 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f7e2c0-3aab-406b-9af6-f21c4088ff70" path="/var/lib/kubelet/pods/f1f7e2c0-3aab-406b-9af6-f21c4088ff70/volumes" Dec 04 17:59:53 crc kubenswrapper[4948]: W1204 17:59:53.106198 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b408db_1dec_49e0_8212_1193d4fe6a37.slice/crio-933ef6eec024c35e2fa3dcfa2ae62aafc963a1c5d046edbe6567a3b945c72a75 WatchSource:0}: Error finding container 933ef6eec024c35e2fa3dcfa2ae62aafc963a1c5d046edbe6567a3b945c72a75: Status 404 returned error can't find the container with id 933ef6eec024c35e2fa3dcfa2ae62aafc963a1c5d046edbe6567a3b945c72a75 Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.109752 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.570015 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.636118 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlf7b\" (UniqueName: \"kubernetes.io/projected/5ef47a31-159f-42c4-a955-b1e833465dd9-kube-api-access-wlf7b\") pod \"5ef47a31-159f-42c4-a955-b1e833465dd9\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.636369 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-combined-ca-bundle\") pod \"5ef47a31-159f-42c4-a955-b1e833465dd9\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.636871 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-config-data\") pod \"5ef47a31-159f-42c4-a955-b1e833465dd9\" (UID: \"5ef47a31-159f-42c4-a955-b1e833465dd9\") " Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.639915 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef47a31-159f-42c4-a955-b1e833465dd9-kube-api-access-wlf7b" (OuterVolumeSpecName: "kube-api-access-wlf7b") pod "5ef47a31-159f-42c4-a955-b1e833465dd9" (UID: "5ef47a31-159f-42c4-a955-b1e833465dd9"). InnerVolumeSpecName "kube-api-access-wlf7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.676938 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ef47a31-159f-42c4-a955-b1e833465dd9" (UID: "5ef47a31-159f-42c4-a955-b1e833465dd9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.678386 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-config-data" (OuterVolumeSpecName: "config-data") pod "5ef47a31-159f-42c4-a955-b1e833465dd9" (UID: "5ef47a31-159f-42c4-a955-b1e833465dd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.739579 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.739921 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlf7b\" (UniqueName: \"kubernetes.io/projected/5ef47a31-159f-42c4-a955-b1e833465dd9-kube-api-access-wlf7b\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.739940 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef47a31-159f-42c4-a955-b1e833465dd9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.867918 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60b408db-1dec-49e0-8212-1193d4fe6a37","Type":"ContainerStarted","Data":"f3c7b7339517046484e3d5e33d506a76290c1be3ff41874ab17e7a9348fa892a"} Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.867969 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60b408db-1dec-49e0-8212-1193d4fe6a37","Type":"ContainerStarted","Data":"87d492aabd3820482e4066aa4ce2c353d0f6250c208d12c75d4f38c935248ea8"} Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.867980 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60b408db-1dec-49e0-8212-1193d4fe6a37","Type":"ContainerStarted","Data":"933ef6eec024c35e2fa3dcfa2ae62aafc963a1c5d046edbe6567a3b945c72a75"} Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.870326 4948 generic.go:334] "Generic (PLEG): container finished" podID="5ef47a31-159f-42c4-a955-b1e833465dd9" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" exitCode=0 Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.870834 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.879126 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5ef47a31-159f-42c4-a955-b1e833465dd9","Type":"ContainerDied","Data":"1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee"} Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.879169 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5ef47a31-159f-42c4-a955-b1e833465dd9","Type":"ContainerDied","Data":"7f3cd23df1554fcb3d36abfc756cd44622514399510938dcd440e19cfc2d9f94"} Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.879187 4948 scope.go:117] "RemoveContainer" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.901395 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.901372327 podStartE2EDuration="1.901372327s" podCreationTimestamp="2025-12-04 17:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:53.892453932 +0000 UTC m=+2005.253528334" watchObservedRunningTime="2025-12-04 17:59:53.901372327 +0000 UTC m=+2005.262446729" Dec 04 17:59:53 crc 
kubenswrapper[4948]: I1204 17:59:53.917144 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.920830 4948 scope.go:117] "RemoveContainer" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" Dec 04 17:59:53 crc kubenswrapper[4948]: E1204 17:59:53.921538 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee\": container with ID starting with 1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee not found: ID does not exist" containerID="1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.921576 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee"} err="failed to get container status \"1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee\": rpc error: code = NotFound desc = could not find container \"1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee\": container with ID starting with 1649f5271b17cde9e884cfb8e2ca0d836947d18f8f299a369d446d119204bdee not found: ID does not exist" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.947478 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.972947 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:53 crc kubenswrapper[4948]: E1204 17:59:53.973667 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef47a31-159f-42c4-a955-b1e833465dd9" containerName="nova-scheduler-scheduler" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.973758 4948 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ef47a31-159f-42c4-a955-b1e833465dd9" containerName="nova-scheduler-scheduler" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.974033 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef47a31-159f-42c4-a955-b1e833465dd9" containerName="nova-scheduler-scheduler" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.974676 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.983100 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 17:59:53 crc kubenswrapper[4948]: I1204 17:59:53.994909 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.047202 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwdn\" (UniqueName: \"kubernetes.io/projected/bbda827a-8528-4b7f-8d4c-70fe8be65d27-kube-api-access-nxwdn\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.047355 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.047463 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-config-data\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: 
I1204 17:59:54.149715 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.149759 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-config-data\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.149879 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwdn\" (UniqueName: \"kubernetes.io/projected/bbda827a-8528-4b7f-8d4c-70fe8be65d27-kube-api-access-nxwdn\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.153780 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.155504 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-config-data\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.165938 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwdn\" (UniqueName: 
\"kubernetes.io/projected/bbda827a-8528-4b7f-8d4c-70fe8be65d27-kube-api-access-nxwdn\") pod \"nova-scheduler-0\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.302017 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.747893 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 17:59:54 crc kubenswrapper[4948]: W1204 17:59:54.753843 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbda827a_8528_4b7f_8d4c_70fe8be65d27.slice/crio-ceb5d1f3057c688bb7eb89af8f4cba0cd723443d1a2971f88caac2f38d93f692 WatchSource:0}: Error finding container ceb5d1f3057c688bb7eb89af8f4cba0cd723443d1a2971f88caac2f38d93f692: Status 404 returned error can't find the container with id ceb5d1f3057c688bb7eb89af8f4cba0cd723443d1a2971f88caac2f38d93f692 Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.880873 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbda827a-8528-4b7f-8d4c-70fe8be65d27","Type":"ContainerStarted","Data":"ceb5d1f3057c688bb7eb89af8f4cba0cd723443d1a2971f88caac2f38d93f692"} Dec 04 17:59:54 crc kubenswrapper[4948]: I1204 17:59:54.926559 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef47a31-159f-42c4-a955-b1e833465dd9" path="/var/lib/kubelet/pods/5ef47a31-159f-42c4-a955-b1e833465dd9/volumes" Dec 04 17:59:55 crc kubenswrapper[4948]: I1204 17:59:55.893641 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbda827a-8528-4b7f-8d4c-70fe8be65d27","Type":"ContainerStarted","Data":"72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271"} Dec 04 17:59:57 crc kubenswrapper[4948]: I1204 17:59:57.613341 4948 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 17:59:57 crc kubenswrapper[4948]: I1204 17:59:57.613582 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.302313 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.434451 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=6.434420922 podStartE2EDuration="6.434420922s" podCreationTimestamp="2025-12-04 17:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:59:55.914846292 +0000 UTC m=+2007.275920734" watchObservedRunningTime="2025-12-04 17:59:59.434420922 +0000 UTC m=+2010.795495364" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.437556 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8js"] Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.440121 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.451004 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8js"] Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.550062 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z67n4\" (UniqueName: \"kubernetes.io/projected/00a4c977-360c-4133-b59e-4c3e47319ad0-kube-api-access-z67n4\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.550114 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-catalog-content\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.550403 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-utilities\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.652947 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z67n4\" (UniqueName: \"kubernetes.io/projected/00a4c977-360c-4133-b59e-4c3e47319ad0-kube-api-access-z67n4\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.653068 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-catalog-content\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.653187 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-utilities\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.653902 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-catalog-content\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.654117 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-utilities\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.678705 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z67n4\" (UniqueName: \"kubernetes.io/projected/00a4c977-360c-4133-b59e-4c3e47319ad0-kube-api-access-z67n4\") pod \"redhat-marketplace-bw8js\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 17:59:59 crc kubenswrapper[4948]: I1204 17:59:59.812883 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.136953 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4"] Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.138899 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.141374 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.141389 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.150320 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4"] Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.265966 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-config-volume\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.266382 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-secret-volume\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: 
I1204 18:00:00.266625 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6nd\" (UniqueName: \"kubernetes.io/projected/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-kube-api-access-ng6nd\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.285089 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8js"] Dec 04 18:00:00 crc kubenswrapper[4948]: W1204 18:00:00.285161 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a4c977_360c_4133_b59e_4c3e47319ad0.slice/crio-5d462f15ce19d9c4a6331943a60c5b6d668b05a5b5a1cc00ff8c5cb28224b13a WatchSource:0}: Error finding container 5d462f15ce19d9c4a6331943a60c5b6d668b05a5b5a1cc00ff8c5cb28224b13a: Status 404 returned error can't find the container with id 5d462f15ce19d9c4a6331943a60c5b6d668b05a5b5a1cc00ff8c5cb28224b13a Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.368937 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-config-volume\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.369033 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-secret-volume\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc 
kubenswrapper[4948]: I1204 18:00:00.369166 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng6nd\" (UniqueName: \"kubernetes.io/projected/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-kube-api-access-ng6nd\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.369815 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-config-volume\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.374904 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-secret-volume\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.387095 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng6nd\" (UniqueName: \"kubernetes.io/projected/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-kube-api-access-ng6nd\") pod \"collect-profiles-29414520-tzhn4\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.458604 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.879953 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4"] Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.949772 4948 generic.go:334] "Generic (PLEG): container finished" podID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerID="5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3" exitCode=0 Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.949840 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8js" event={"ID":"00a4c977-360c-4133-b59e-4c3e47319ad0","Type":"ContainerDied","Data":"5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3"} Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.949867 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8js" event={"ID":"00a4c977-360c-4133-b59e-4c3e47319ad0","Type":"ContainerStarted","Data":"5d462f15ce19d9c4a6331943a60c5b6d668b05a5b5a1cc00ff8c5cb28224b13a"} Dec 04 18:00:00 crc kubenswrapper[4948]: I1204 18:00:00.951643 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" event={"ID":"8bf95293-dc43-49fc-8f3f-259c3b5d2a11","Type":"ContainerStarted","Data":"04da8f057eebdc285ce62bd01161b62744f32345de7761d618a9406a421f1258"} Dec 04 18:00:01 crc kubenswrapper[4948]: I1204 18:00:01.200965 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 18:00:01 crc kubenswrapper[4948]: I1204 18:00:01.201372 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 18:00:01 crc kubenswrapper[4948]: I1204 18:00:01.967944 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="8bf95293-dc43-49fc-8f3f-259c3b5d2a11" containerID="1849f5c57638f83dd1f1562122e4836ea898c2ee6e3703c5f4eb47622c594734" exitCode=0 Dec 04 18:00:01 crc kubenswrapper[4948]: I1204 18:00:01.967982 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" event={"ID":"8bf95293-dc43-49fc-8f3f-259c3b5d2a11","Type":"ContainerDied","Data":"1849f5c57638f83dd1f1562122e4836ea898c2ee6e3703c5f4eb47622c594734"} Dec 04 18:00:02 crc kubenswrapper[4948]: I1204 18:00:02.215194 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 18:00:02 crc kubenswrapper[4948]: I1204 18:00:02.215222 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 18:00:02 crc kubenswrapper[4948]: I1204 18:00:02.614320 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 18:00:02 crc kubenswrapper[4948]: I1204 18:00:02.614387 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.424792 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.461172 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng6nd\" (UniqueName: \"kubernetes.io/projected/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-kube-api-access-ng6nd\") pod \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.461240 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-config-volume\") pod \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.461398 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-secret-volume\") pod \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\" (UID: \"8bf95293-dc43-49fc-8f3f-259c3b5d2a11\") " Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.462171 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-config-volume" (OuterVolumeSpecName: "config-volume") pod "8bf95293-dc43-49fc-8f3f-259c3b5d2a11" (UID: "8bf95293-dc43-49fc-8f3f-259c3b5d2a11"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.467532 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-kube-api-access-ng6nd" (OuterVolumeSpecName: "kube-api-access-ng6nd") pod "8bf95293-dc43-49fc-8f3f-259c3b5d2a11" (UID: "8bf95293-dc43-49fc-8f3f-259c3b5d2a11"). 
InnerVolumeSpecName "kube-api-access-ng6nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.470150 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8bf95293-dc43-49fc-8f3f-259c3b5d2a11" (UID: "8bf95293-dc43-49fc-8f3f-259c3b5d2a11"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.563675 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.563710 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng6nd\" (UniqueName: \"kubernetes.io/projected/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-kube-api-access-ng6nd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.563720 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8bf95293-dc43-49fc-8f3f-259c3b5d2a11-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.629209 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.629258 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.988611 4948 generic.go:334] "Generic (PLEG): container finished" podID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerID="62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb" exitCode=0 Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.988699 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8js" event={"ID":"00a4c977-360c-4133-b59e-4c3e47319ad0","Type":"ContainerDied","Data":"62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb"} Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.994880 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" event={"ID":"8bf95293-dc43-49fc-8f3f-259c3b5d2a11","Type":"ContainerDied","Data":"04da8f057eebdc285ce62bd01161b62744f32345de7761d618a9406a421f1258"} Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.994917 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04da8f057eebdc285ce62bd01161b62744f32345de7761d618a9406a421f1258" Dec 04 18:00:03 crc kubenswrapper[4948]: I1204 18:00:03.994967 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4" Dec 04 18:00:04 crc kubenswrapper[4948]: I1204 18:00:04.302709 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 18:00:04 crc kubenswrapper[4948]: I1204 18:00:04.339445 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 18:00:04 crc kubenswrapper[4948]: I1204 18:00:04.526638 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf"] Dec 04 18:00:04 crc kubenswrapper[4948]: I1204 18:00:04.535589 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414475-l8jtf"] Dec 04 18:00:04 crc kubenswrapper[4948]: I1204 18:00:04.926899 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702984bc-83a3-4da1-bd02-f8879e78502d" path="/var/lib/kubelet/pods/702984bc-83a3-4da1-bd02-f8879e78502d/volumes" Dec 04 18:00:05 crc kubenswrapper[4948]: I1204 18:00:05.010001 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8js" event={"ID":"00a4c977-360c-4133-b59e-4c3e47319ad0","Type":"ContainerStarted","Data":"5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf"} Dec 04 18:00:05 crc kubenswrapper[4948]: I1204 18:00:05.033804 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bw8js" podStartSLOduration=2.618266911 podStartE2EDuration="6.033780299s" podCreationTimestamp="2025-12-04 17:59:59 +0000 UTC" firstStartedPulling="2025-12-04 18:00:00.952429676 +0000 UTC m=+2012.313504078" lastFinishedPulling="2025-12-04 18:00:04.367943074 +0000 UTC m=+2015.729017466" observedRunningTime="2025-12-04 18:00:05.028784298 +0000 UTC m=+2016.389858710" watchObservedRunningTime="2025-12-04 
18:00:05.033780299 +0000 UTC m=+2016.394854701" Dec 04 18:00:05 crc kubenswrapper[4948]: I1204 18:00:05.037748 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 18:00:09 crc kubenswrapper[4948]: I1204 18:00:09.087247 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 18:00:09 crc kubenswrapper[4948]: I1204 18:00:09.813293 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:09 crc kubenswrapper[4948]: I1204 18:00:09.813560 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:09 crc kubenswrapper[4948]: I1204 18:00:09.877758 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:10 crc kubenswrapper[4948]: I1204 18:00:10.141722 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:10 crc kubenswrapper[4948]: I1204 18:00:10.194329 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8js"] Dec 04 18:00:10 crc kubenswrapper[4948]: I1204 18:00:10.625252 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:00:10 crc kubenswrapper[4948]: I1204 18:00:10.625310 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:00:11 crc kubenswrapper[4948]: I1204 18:00:11.209156 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 18:00:11 crc kubenswrapper[4948]: I1204 18:00:11.210929 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 18:00:11 crc kubenswrapper[4948]: I1204 18:00:11.211940 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 18:00:11 crc kubenswrapper[4948]: I1204 18:00:11.219005 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.090873 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.091333 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bw8js" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="registry-server" containerID="cri-o://5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf" gracePeriod=2 Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.097470 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.602822 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.626681 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.637707 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.643373 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.662071 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z67n4\" (UniqueName: \"kubernetes.io/projected/00a4c977-360c-4133-b59e-4c3e47319ad0-kube-api-access-z67n4\") pod \"00a4c977-360c-4133-b59e-4c3e47319ad0\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.662175 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-utilities\") pod \"00a4c977-360c-4133-b59e-4c3e47319ad0\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.662412 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-catalog-content\") pod \"00a4c977-360c-4133-b59e-4c3e47319ad0\" (UID: \"00a4c977-360c-4133-b59e-4c3e47319ad0\") " Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.664095 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-utilities" (OuterVolumeSpecName: "utilities") pod "00a4c977-360c-4133-b59e-4c3e47319ad0" (UID: "00a4c977-360c-4133-b59e-4c3e47319ad0"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.670272 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a4c977-360c-4133-b59e-4c3e47319ad0-kube-api-access-z67n4" (OuterVolumeSpecName: "kube-api-access-z67n4") pod "00a4c977-360c-4133-b59e-4c3e47319ad0" (UID: "00a4c977-360c-4133-b59e-4c3e47319ad0"). InnerVolumeSpecName "kube-api-access-z67n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.689175 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00a4c977-360c-4133-b59e-4c3e47319ad0" (UID: "00a4c977-360c-4133-b59e-4c3e47319ad0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.764506 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.764537 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z67n4\" (UniqueName: \"kubernetes.io/projected/00a4c977-360c-4133-b59e-4c3e47319ad0-kube-api-access-z67n4\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:12 crc kubenswrapper[4948]: I1204 18:00:12.764550 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4c977-360c-4133-b59e-4c3e47319ad0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.102118 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bw8js" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.102143 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8js" event={"ID":"00a4c977-360c-4133-b59e-4c3e47319ad0","Type":"ContainerDied","Data":"5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf"} Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.102187 4948 scope.go:117] "RemoveContainer" containerID="5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.102116 4948 generic.go:334] "Generic (PLEG): container finished" podID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerID="5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf" exitCode=0 Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.102293 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bw8js" event={"ID":"00a4c977-360c-4133-b59e-4c3e47319ad0","Type":"ContainerDied","Data":"5d462f15ce19d9c4a6331943a60c5b6d668b05a5b5a1cc00ff8c5cb28224b13a"} Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.109548 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.131675 4948 scope.go:117] "RemoveContainer" containerID="62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.131747 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8js"] Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.146392 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bw8js"] Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.161875 4948 scope.go:117] "RemoveContainer" 
containerID="5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.216144 4948 scope.go:117] "RemoveContainer" containerID="5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf" Dec 04 18:00:13 crc kubenswrapper[4948]: E1204 18:00:13.217063 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf\": container with ID starting with 5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf not found: ID does not exist" containerID="5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.217092 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf"} err="failed to get container status \"5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf\": rpc error: code = NotFound desc = could not find container \"5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf\": container with ID starting with 5ba7efa4af9a6b6de5429374ee6363e88110951ebc3a1fe4a29d92ab9e960dbf not found: ID does not exist" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.217114 4948 scope.go:117] "RemoveContainer" containerID="62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb" Dec 04 18:00:13 crc kubenswrapper[4948]: E1204 18:00:13.217363 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb\": container with ID starting with 62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb not found: ID does not exist" containerID="62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb" Dec 04 18:00:13 crc 
kubenswrapper[4948]: I1204 18:00:13.217383 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb"} err="failed to get container status \"62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb\": rpc error: code = NotFound desc = could not find container \"62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb\": container with ID starting with 62b5ce928239f812384179c97fe03ab3fa58c97125a61b59ea8470fec614f0eb not found: ID does not exist" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.217397 4948 scope.go:117] "RemoveContainer" containerID="5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3" Dec 04 18:00:13 crc kubenswrapper[4948]: E1204 18:00:13.217685 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3\": container with ID starting with 5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3 not found: ID does not exist" containerID="5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3" Dec 04 18:00:13 crc kubenswrapper[4948]: I1204 18:00:13.217701 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3"} err="failed to get container status \"5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3\": rpc error: code = NotFound desc = could not find container \"5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3\": container with ID starting with 5020b6423d08a364e8ea15a3dac965d41efe3a8b132b416ad5f7de9a707752c3 not found: ID does not exist" Dec 04 18:00:14 crc kubenswrapper[4948]: I1204 18:00:14.924987 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" 
path="/var/lib/kubelet/pods/00a4c977-360c-4133-b59e-4c3e47319ad0/volumes" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.730960 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.731729 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" containerName="openstackclient" containerID="cri-o://d9a6ef11482121f4a842966dc02f23660d77b8786023f6691bf3ecec31db0c0c" gracePeriod=2 Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.754697 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.850655 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron4bb1-account-delete-4fsjg"] Dec 04 18:00:30 crc kubenswrapper[4948]: E1204 18:00:30.851027 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="registry-server" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851054 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="registry-server" Dec 04 18:00:30 crc kubenswrapper[4948]: E1204 18:00:30.851080 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" containerName="openstackclient" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851086 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" containerName="openstackclient" Dec 04 18:00:30 crc kubenswrapper[4948]: E1204 18:00:30.851099 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf95293-dc43-49fc-8f3f-259c3b5d2a11" containerName="collect-profiles" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851105 4948 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8bf95293-dc43-49fc-8f3f-259c3b5d2a11" containerName="collect-profiles" Dec 04 18:00:30 crc kubenswrapper[4948]: E1204 18:00:30.851125 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="extract-content" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851132 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="extract-content" Dec 04 18:00:30 crc kubenswrapper[4948]: E1204 18:00:30.851143 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="extract-utilities" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851149 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="extract-utilities" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851325 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" containerName="openstackclient" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851339 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a4c977-360c-4133-b59e-4c3e47319ad0" containerName="registry-server" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851359 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf95293-dc43-49fc-8f3f-259c3b5d2a11" containerName="collect-profiles" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.851898 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.862363 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 18:00:30 crc kubenswrapper[4948]: I1204 18:00:30.907690 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron4bb1-account-delete-4fsjg"] Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.047238 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.048598 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:31.547304429 +0000 UTC m=+2042.908378821 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-scripts" not found Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.048683 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59806891-9fa2-446a-87c1-b7efbf4b692b-operator-scripts\") pod \"neutron4bb1-account-delete-4fsjg\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.068618 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-rzjh8"] Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.070094 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Dec 04 18:00:31 crc kubenswrapper[4948]: 
E1204 18:00:31.070146 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:31.570131237 +0000 UTC m=+2042.931205629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-config" not found Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.070679 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fpmf\" (UniqueName: \"kubernetes.io/projected/59806891-9fa2-446a-87c1-b7efbf4b692b-kube-api-access-4fpmf\") pod \"neutron4bb1-account-delete-4fsjg\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.078704 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.078759 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data podName:b34ca165-31d6-44fa-b175-ed2b1bf9f766 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:31.578744143 +0000 UTC m=+2042.939818535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data") pod "rabbitmq-cell1-server-0" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766") : configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.129905 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-kgd58"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.130154 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-kgd58" podUID="64ae0228-b131-4cec-a52f-b5786c22355c" containerName="openstack-network-exporter" containerID="cri-o://5111ccd42a2dcb9a24627bf842d9a0b851e3ae53f8f5be34d0dd24d8c4061014" gracePeriod=30 Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.150179 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bd2ch"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.180093 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance47ce-account-delete-9mwp2"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.183912 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance47ce-account-delete-9mwp2" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.185904 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fpmf\" (UniqueName: \"kubernetes.io/projected/59806891-9fa2-446a-87c1-b7efbf4b692b-kube-api-access-4fpmf\") pod \"neutron4bb1-account-delete-4fsjg\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.185996 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59806891-9fa2-446a-87c1-b7efbf4b692b-operator-scripts\") pod \"neutron4bb1-account-delete-4fsjg\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.186851 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59806891-9fa2-446a-87c1-b7efbf4b692b-operator-scripts\") pod \"neutron4bb1-account-delete-4fsjg\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.209296 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance47ce-account-delete-9mwp2"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.225408 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fpmf\" (UniqueName: \"kubernetes.io/projected/59806891-9fa2-446a-87c1-b7efbf4b692b-kube-api-access-4fpmf\") pod \"neutron4bb1-account-delete-4fsjg\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.229455 4948 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder20a5-account-delete-b2bnv"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.233543 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder20a5-account-delete-b2bnv" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.273431 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder20a5-account-delete-b2bnv"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.286308 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hpqvt"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.287536 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9dp\" (UniqueName: \"kubernetes.io/projected/80f9ff11-f145-4e76-a9fc-084de8ccb029-kube-api-access-rr9dp\") pod \"glance47ce-account-delete-9mwp2\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " pod="openstack/glance47ce-account-delete-9mwp2" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.287607 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f9ff11-f145-4e76-a9fc-084de8ccb029-operator-scripts\") pod \"glance47ce-account-delete-9mwp2\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " pod="openstack/glance47ce-account-delete-9mwp2" Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.298725 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hpqvt"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.316532 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicanaee8-account-delete-r5gkz"] Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.319074 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.328466 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanaee8-account-delete-r5gkz"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.363674 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kgd58_64ae0228-b131-4cec-a52f-b5786c22355c/openstack-network-exporter/0.log"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.363719 4948 generic.go:334] "Generic (PLEG): container finished" podID="64ae0228-b131-4cec-a52f-b5786c22355c" containerID="5111ccd42a2dcb9a24627bf842d9a0b851e3ae53f8f5be34d0dd24d8c4061014" exitCode=2
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.363758 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kgd58" event={"ID":"64ae0228-b131-4cec-a52f-b5786c22355c","Type":"ContainerDied","Data":"5111ccd42a2dcb9a24627bf842d9a0b851e3ae53f8f5be34d0dd24d8c4061014"}
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.384131 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement7046-account-delete-d78kq"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.385455 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.389054 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpwvg\" (UniqueName: \"kubernetes.io/projected/31e5cc30-bac1-418c-af51-af5cb1d8d595-kube-api-access-vpwvg\") pod \"cinder20a5-account-delete-b2bnv\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.389115 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9dp\" (UniqueName: \"kubernetes.io/projected/80f9ff11-f145-4e76-a9fc-084de8ccb029-kube-api-access-rr9dp\") pod \"glance47ce-account-delete-9mwp2\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " pod="openstack/glance47ce-account-delete-9mwp2"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.389157 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f9ff11-f145-4e76-a9fc-084de8ccb029-operator-scripts\") pod \"glance47ce-account-delete-9mwp2\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " pod="openstack/glance47ce-account-delete-9mwp2"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.389191 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e5cc30-bac1-418c-af51-af5cb1d8d595-operator-scripts\") pod \"cinder20a5-account-delete-b2bnv\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.402126 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f9ff11-f145-4e76-a9fc-084de8ccb029-operator-scripts\") pod \"glance47ce-account-delete-9mwp2\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " pod="openstack/glance47ce-account-delete-9mwp2"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.454825 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9dp\" (UniqueName: \"kubernetes.io/projected/80f9ff11-f145-4e76-a9fc-084de8ccb029-kube-api-access-rr9dp\") pod \"glance47ce-account-delete-9mwp2\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " pod="openstack/glance47ce-account-delete-9mwp2"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.481527 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement7046-account-delete-d78kq"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.485666 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron4bb1-account-delete-4fsjg"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.492137 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts\") pod \"placement7046-account-delete-d78kq\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.495608 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e5cc30-bac1-418c-af51-af5cb1d8d595-operator-scripts\") pod \"cinder20a5-account-delete-b2bnv\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.495706 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9acee6d3-23af-4793-8e56-8f3fbc169779-operator-scripts\") pod \"barbicanaee8-account-delete-r5gkz\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.495731 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzf26\" (UniqueName: \"kubernetes.io/projected/9acee6d3-23af-4793-8e56-8f3fbc169779-kube-api-access-jzf26\") pod \"barbicanaee8-account-delete-r5gkz\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.495924 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b7hg\" (UniqueName: \"kubernetes.io/projected/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-kube-api-access-5b7hg\") pod \"placement7046-account-delete-d78kq\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.496000 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpwvg\" (UniqueName: \"kubernetes.io/projected/31e5cc30-bac1-418c-af51-af5cb1d8d595-kube-api-access-vpwvg\") pod \"cinder20a5-account-delete-b2bnv\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.502910 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e5cc30-bac1-418c-af51-af5cb1d8d595-operator-scripts\") pod \"cinder20a5-account-delete-b2bnv\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.515583 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance47ce-account-delete-9mwp2"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.531035 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpwvg\" (UniqueName: \"kubernetes.io/projected/31e5cc30-bac1-418c-af51-af5cb1d8d595-kube-api-access-vpwvg\") pod \"cinder20a5-account-delete-b2bnv\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.549358 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jn67c"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.587996 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jn67c"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.600806 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b7hg\" (UniqueName: \"kubernetes.io/projected/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-kube-api-access-5b7hg\") pod \"placement7046-account-delete-d78kq\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.600933 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts\") pod \"placement7046-account-delete-d78kq\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.601071 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9acee6d3-23af-4793-8e56-8f3fbc169779-operator-scripts\") pod \"barbicanaee8-account-delete-r5gkz\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.601091 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzf26\" (UniqueName: \"kubernetes.io/projected/9acee6d3-23af-4793-8e56-8f3fbc169779-kube-api-access-jzf26\") pod \"barbicanaee8-account-delete-r5gkz\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.601265 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.601316 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data podName:b34ca165-31d6-44fa-b175-ed2b1bf9f766 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:32.601298552 +0000 UTC m=+2043.962372954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data") pod "rabbitmq-cell1-server-0" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766") : configmap "rabbitmq-cell1-config-data" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.601713 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.601749 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:32.601739174 +0000 UTC m=+2043.962813576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-scripts" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.602613 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts\") pod \"placement7046-account-delete-d78kq\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.602661 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.602686 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:32.602677528 +0000 UTC m=+2043.963751930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-config" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.603190 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9acee6d3-23af-4793-8e56-8f3fbc169779-operator-scripts\") pod \"barbicanaee8-account-delete-r5gkz\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.634514 4948 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-bd2ch" message=<
Dec 04 18:00:31 crc kubenswrapper[4948]: Exiting ovn-controller (1) [ OK ]
Dec 04 18:00:31 crc kubenswrapper[4948]: >
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.634551 4948 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-bd2ch" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerName="ovn-controller" containerID="cri-o://fa492590ec2bee6d8477d4e723bea4207fb5e920790c621ae08dfef6fec716cc"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.634585 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-bd2ch" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerName="ovn-controller" containerID="cri-o://fa492590ec2bee6d8477d4e723bea4207fb5e920790c621ae08dfef6fec716cc" gracePeriod=30
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.647475 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b7hg\" (UniqueName: \"kubernetes.io/projected/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-kube-api-access-5b7hg\") pod \"placement7046-account-delete-d78kq\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.654964 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder20a5-account-delete-b2bnv"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.657391 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzf26\" (UniqueName: \"kubernetes.io/projected/9acee6d3-23af-4793-8e56-8f3fbc169779-kube-api-access-jzf26\") pod \"barbicanaee8-account-delete-r5gkz\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.663407 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nccrm"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.694130 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nccrm"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.703365 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanaee8-account-delete-r5gkz"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.709502 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0a2da-account-delete-2tst9"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.710649 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.734195 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0a2da-account-delete-2tst9"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.754363 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement7046-account-delete-d78kq"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.757936 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.807198 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.807517 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="ovn-northd" containerID="cri-o://60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" gracePeriod=30
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.808017 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="openstack-network-exporter" containerID="cri-o://27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205" gracePeriod=30
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.809379 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpft\" (UniqueName: \"kubernetes.io/projected/e1b64e38-8be0-41af-bf89-878d17bbd7a5-kube-api-access-jnpft\") pod \"novacell0a2da-account-delete-2tst9\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.809504 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts\") pod \"novacell0a2da-account-delete-2tst9\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.822731 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1e276-account-delete-5xw8n"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.824346 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.834787 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1e276-account-delete-5xw8n"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.862641 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapide93-account-delete-s9wkh"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.864276 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.913092 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xxfqv"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.914427 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpft\" (UniqueName: \"kubernetes.io/projected/e1b64e38-8be0-41af-bf89-878d17bbd7a5-kube-api-access-jnpft\") pod \"novacell0a2da-account-delete-2tst9\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.914509 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtr9\" (UniqueName: \"kubernetes.io/projected/4d563e18-b478-40af-b4c6-b2dd89ea863a-kube-api-access-4wtr9\") pod \"novacell1e276-account-delete-5xw8n\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.914583 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d563e18-b478-40af-b4c6-b2dd89ea863a-operator-scripts\") pod \"novacell1e276-account-delete-5xw8n\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.914618 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts\") pod \"novacell0a2da-account-delete-2tst9\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.925961 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: E1204 18:00:31.926009 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data podName:90b4baf7-8366-4f47-8515-c33e1b691856 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:32.425993654 +0000 UTC m=+2043.787068056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data") pod "rabbitmq-server-0" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856") : configmap "rabbitmq-config-data" not found
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.926853 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts\") pod \"novacell0a2da-account-delete-2tst9\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.934200 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapide93-account-delete-s9wkh"]
Dec 04 18:00:31 crc kubenswrapper[4948]: I1204 18:00:31.963118 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xxfqv"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.047417 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpft\" (UniqueName: \"kubernetes.io/projected/e1b64e38-8be0-41af-bf89-878d17bbd7a5-kube-api-access-jnpft\") pod \"novacell0a2da-account-delete-2tst9\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.067475 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0a2da-account-delete-2tst9"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.072371 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7zdlz"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.077604 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d563e18-b478-40af-b4c6-b2dd89ea863a-operator-scripts\") pod \"novacell1e276-account-delete-5xw8n\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.077715 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48d8f605-3274-40ec-8a30-8dc188fdcd86-operator-scripts\") pod \"novaapide93-account-delete-s9wkh\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.077801 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2gh\" (UniqueName: \"kubernetes.io/projected/48d8f605-3274-40ec-8a30-8dc188fdcd86-kube-api-access-cl2gh\") pod \"novaapide93-account-delete-s9wkh\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.078017 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtr9\" (UniqueName: \"kubernetes.io/projected/4d563e18-b478-40af-b4c6-b2dd89ea863a-kube-api-access-4wtr9\") pod \"novacell1e276-account-delete-5xw8n\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.079145 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d563e18-b478-40af-b4c6-b2dd89ea863a-operator-scripts\") pod \"novacell1e276-account-delete-5xw8n\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.187549 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48d8f605-3274-40ec-8a30-8dc188fdcd86-operator-scripts\") pod \"novaapide93-account-delete-s9wkh\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.187953 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2gh\" (UniqueName: \"kubernetes.io/projected/48d8f605-3274-40ec-8a30-8dc188fdcd86-kube-api-access-cl2gh\") pod \"novaapide93-account-delete-s9wkh\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.188714 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48d8f605-3274-40ec-8a30-8dc188fdcd86-operator-scripts\") pod \"novaapide93-account-delete-s9wkh\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.217220 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtr9\" (UniqueName: \"kubernetes.io/projected/4d563e18-b478-40af-b4c6-b2dd89ea863a-kube-api-access-4wtr9\") pod \"novacell1e276-account-delete-5xw8n\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.297394 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7zdlz"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.308501 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b7b8cbd95-z6gmw"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.308788 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b7b8cbd95-z6gmw" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-api" containerID="cri-o://86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.308918 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6b7b8cbd95-z6gmw" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-httpd" containerID="cri-o://7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.316679 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2gh\" (UniqueName: \"kubernetes.io/projected/48d8f605-3274-40ec-8a30-8dc188fdcd86-kube-api-access-cl2gh\") pod \"novaapide93-account-delete-s9wkh\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " pod="openstack/novaapide93-account-delete-s9wkh"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.341599 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2wcvz"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.367721 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-2wcvz"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.395520 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.396570 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="openstack-network-exporter" containerID="cri-o://f9414a835d8c13eaf98608d7189bcb0337dfc8008b5995dd6461769594b0b04b" gracePeriod=300
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.405775 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kgd58_64ae0228-b131-4cec-a52f-b5786c22355c/openstack-network-exporter/0.log"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.405863 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kgd58" event={"ID":"64ae0228-b131-4cec-a52f-b5786c22355c","Type":"ContainerDied","Data":"4f7838c6613c72321ee3bd3a66ad70afa336dcd4e741b7c4ca175872588a253b"}
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.405888 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f7838c6613c72321ee3bd3a66ad70afa336dcd4e741b7c4ca175872588a253b"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.412497 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.413076 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="openstack-network-exporter" containerID="cri-o://055149cc29d5a8a0fbd7d07c17f45bd048d5a2c834bca9a88e2e890d03daf20f" gracePeriod=300
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.414627 4948 generic.go:334] "Generic (PLEG): container finished" podID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerID="27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205" exitCode=2
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.414739 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b6b365e8-6c2a-41fe-b50a-1702144d67d4","Type":"ContainerDied","Data":"27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205"}
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.424834 4948 generic.go:334] "Generic (PLEG): container finished" podID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerID="fa492590ec2bee6d8477d4e723bea4207fb5e920790c621ae08dfef6fec716cc" exitCode=0
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.424886 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch" event={"ID":"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1","Type":"ContainerDied","Data":"fa492590ec2bee6d8477d4e723bea4207fb5e920790c621ae08dfef6fec716cc"}
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.430684 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-hnjl5"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.430919 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" podUID="3326569d-4475-4365-8d93-b2b1522b6f60" containerName="dnsmasq-dns" containerID="cri-o://c91a77902e7ba5e05adfd3330e1a213f391e817ac078428b5350fe1e14dbe94b" gracePeriod=10
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.445520 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.445771 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-log" containerID="cri-o://4ae447a7f1fce2c6cfb74358e924d30d23ffd33d65e3ced21c1749bddfe8ce91" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.446164 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-httpd" containerID="cri-o://db8b9187d0c187cfc911c618a1e41befbb49ce369abd91c26b5274db741964ad" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.474357 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.474960 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-log" containerID="cri-o://b058b84e4f67a262a8cae930973840aaf5fda1c3dfc929a04a6794fb308c7d61" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.475706 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-httpd" containerID="cri-o://3ed5978b64fee059b95b3f3fcb1a1ab665b53aab15fb25269bdf21eeb866ef81" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.499002 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.499799 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data podName:90b4baf7-8366-4f47-8515-c33e1b691856 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:33.499781326 +0000 UTC m=+2044.860855728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data") pod "rabbitmq-server-0" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856") : configmap "rabbitmq-config-data" not found
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.500253 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1e276-account-delete-5xw8n"
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.511953 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.512660 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-server" containerID="cri-o://d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.513582 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="swift-recon-cron" containerID="cri-o://29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.516592 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="rsync" containerID="cri-o://a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.516746 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-expirer" containerID="cri-o://ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.516899 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-updater" containerID="cri-o://bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517020 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-auditor" containerID="cri-o://bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517138 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-replicator" containerID="cri-o://0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517264 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-server" containerID="cri-o://716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517431 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-updater" containerID="cri-o://cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517563 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-auditor" containerID="cri-o://d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517677 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-replicator" containerID="cri-o://bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517791 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-server" containerID="cri-o://014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.517936 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-reaper" containerID="cri-o://86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.518078 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-auditor" containerID="cri-o://5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.518178 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-replicator" containerID="cri-o://07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5" gracePeriod=30
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.533865 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-2qm6f"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.550478 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-2qm6f"]
Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.559072 4948 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/novaapide93-account-delete-s9wkh" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.562960 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-cml8d"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.581163 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-cml8d"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.591412 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="ovsdbserver-nb" containerID="cri-o://0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544" gracePeriod=300 Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.601798 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.601883 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data podName:b34ca165-31d6-44fa-b175-ed2b1bf9f766 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:34.601862862 +0000 UTC m=+2045.962937264 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data") pod "rabbitmq-cell1-server-0" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766") : configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.618417 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.618655 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="cinder-scheduler" containerID="cri-o://843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.619087 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="probe" containerID="cri-o://2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.624498 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="ovsdbserver-sb" containerID="cri-o://568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c" gracePeriod=300 Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.639243 4948 log.go:32] "ExecSync cmd from runtime service failed" err=< Dec 04 18:00:32 crc kubenswrapper[4948]: rpc error: code = Unknown desc = command error: setns `mnt`: Bad file descriptor Dec 04 18:00:32 crc kubenswrapper[4948]: fail startup Dec 04 18:00:32 crc kubenswrapper[4948]: , stdout: , stderr: , exit code -1 Dec 04 18:00:32 crc kubenswrapper[4948]: > containerID="568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c" 
cmd=["/usr/bin/pidof","ovsdb-server"] Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.642284 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c is running failed: container process not found" containerID="568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.642707 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c is running failed: container process not found" containerID="568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.642813 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="ovsdbserver-sb" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.666732 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.667019 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api-log" containerID="cri-o://9bcf71c31ca1e73a965969fb92ddba24201c5c11e6c3cb3b4e92e77ecdd4bf87" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.667460 4948 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/cinder-api-0" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api" containerID="cri-o://637497b65838d4e1875162878d30bf8895cfbdd36b9fd9f4596de491cb8f3761" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.746262 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.746339 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:34.746320249 +0000 UTC m=+2046.107394721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-scripts" not found Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.746822 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Dec 04 18:00:32 crc kubenswrapper[4948]: E1204 18:00:32.746864 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:34.746855533 +0000 UTC m=+2046.107929935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-config" not found Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.764457 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d56c8fbdd-fr7fc"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.764759 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d56c8fbdd-fr7fc" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-log" containerID="cri-o://648062ad89f6bf56de82ee3bd52951b86df3e900c322ee6b0c57f21f40ba73d8" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.766641 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d56c8fbdd-fr7fc" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-api" containerID="cri-o://228923eb8a21101983b2bce76096ee36269db4118b9d4b8fa58a9ef47c3110a3" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.780421 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6b7b8cbd95-z6gmw" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": dial tcp 10.217.0.151:9696: connect: connection refused" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.861594 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.891321 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.902322 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-log" containerID="cri-o://87d492aabd3820482e4066aa4ce2c353d0f6250c208d12c75d4f38c935248ea8" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.902502 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-metadata" containerID="cri-o://f3c7b7339517046484e3d5e33d506a76290c1be3ff41874ab17e7a9348fa892a" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.955645 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195f5ec9-3622-48de-931e-9205f34910b0" path="/var/lib/kubelet/pods/195f5ec9-3622-48de-931e-9205f34910b0/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.956490 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd09899-e64d-4b12-b604-dcd87d9c868b" path="/var/lib/kubelet/pods/1bd09899-e64d-4b12-b604-dcd87d9c868b/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.956996 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b81424a-68f9-40e6-bd32-a932a675578a" path="/var/lib/kubelet/pods/2b81424a-68f9-40e6-bd32-a932a675578a/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.957569 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55887774-d332-4083-8f3c-6281330114cd" path="/var/lib/kubelet/pods/55887774-d332-4083-8f3c-6281330114cd/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.960452 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c2276fc-7b3c-4113-abb2-4e2558c9dc03" path="/var/lib/kubelet/pods/7c2276fc-7b3c-4113-abb2-4e2558c9dc03/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.961004 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8913b68d-4b7f-4a2e-b097-a60b0f557827" 
path="/var/lib/kubelet/pods/8913b68d-4b7f-4a2e-b097-a60b0f557827/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.961503 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c608c956-a885-4f52-8f3c-24e9f5283cb3" path="/var/lib/kubelet/pods/c608c956-a885-4f52-8f3c-24e9f5283cb3/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.962567 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74958a4-caed-4579-b0ff-cbabe46b09dd" path="/var/lib/kubelet/pods/c74958a4-caed-4579-b0ff-cbabe46b09dd/volumes" Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.963153 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.969844 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.970072 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-log" containerID="cri-o://7ac6688f04690776901f6c203c84667f87067c95bbc8e52dbe5b9d9106e8a071" gracePeriod=30 Dec 04 18:00:32 crc kubenswrapper[4948]: I1204 18:00:32.970227 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-api" containerID="cri-o://12e734767396eb518b40a349afac5356ca256b1870d63b32a4acf4a594db67b0" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:32.995916 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" containerID="cri-o://8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" gracePeriod=29 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.077562 4948 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6dbb7d984c-hzlwz"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.077776 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker-log" containerID="cri-o://ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.078270 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker" containerID="cri-o://f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.083576 4948 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 04 18:00:33 crc kubenswrapper[4948]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 04 18:00:33 crc kubenswrapper[4948]: + source /usr/local/bin/container-scripts/functions Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNBridge=br-int Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNRemote=tcp:localhost:6642 Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNEncapType=geneve Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNAvailabilityZones= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ EnableChassisAsGateway=true Dec 04 18:00:33 crc kubenswrapper[4948]: ++ PhysicalNetworks= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNHostName= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 04 18:00:33 crc kubenswrapper[4948]: ++ ovs_dir=/var/lib/openvswitch Dec 04 18:00:33 crc kubenswrapper[4948]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 04 18:00:33 crc 
kubenswrapper[4948]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 04 18:00:33 crc kubenswrapper[4948]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + cleanup_ovsdb_server_semaphore Dec 04 18:00:33 crc kubenswrapper[4948]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 18:00:33 crc kubenswrapper[4948]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 04 18:00:33 crc kubenswrapper[4948]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-rzjh8" message=< Dec 04 18:00:33 crc kubenswrapper[4948]: Exiting ovsdb-server (5) [ OK ] Dec 04 18:00:33 crc kubenswrapper[4948]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 04 18:00:33 crc kubenswrapper[4948]: + source /usr/local/bin/container-scripts/functions Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNBridge=br-int Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNRemote=tcp:localhost:6642 Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNEncapType=geneve Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNAvailabilityZones= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ EnableChassisAsGateway=true Dec 04 18:00:33 crc kubenswrapper[4948]: ++ PhysicalNetworks= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNHostName= Dec 
04 18:00:33 crc kubenswrapper[4948]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 04 18:00:33 crc kubenswrapper[4948]: ++ ovs_dir=/var/lib/openvswitch Dec 04 18:00:33 crc kubenswrapper[4948]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 04 18:00:33 crc kubenswrapper[4948]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 04 18:00:33 crc kubenswrapper[4948]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + cleanup_ovsdb_server_semaphore Dec 04 18:00:33 crc kubenswrapper[4948]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 18:00:33 crc kubenswrapper[4948]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 04 18:00:33 crc kubenswrapper[4948]: > Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.086323 4948 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 04 18:00:33 crc kubenswrapper[4948]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 04 18:00:33 crc kubenswrapper[4948]: + source /usr/local/bin/container-scripts/functions Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNBridge=br-int Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNRemote=tcp:localhost:6642 Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNEncapType=geneve Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNAvailabilityZones= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ EnableChassisAsGateway=true Dec 04 18:00:33 crc kubenswrapper[4948]: ++ PhysicalNetworks= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ OVNHostName= Dec 04 18:00:33 crc kubenswrapper[4948]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 04 18:00:33 crc kubenswrapper[4948]: ++ ovs_dir=/var/lib/openvswitch Dec 04 18:00:33 crc kubenswrapper[4948]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 04 18:00:33 crc kubenswrapper[4948]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 04 18:00:33 crc kubenswrapper[4948]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + sleep 0.5 Dec 04 18:00:33 crc kubenswrapper[4948]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 18:00:33 crc kubenswrapper[4948]: + cleanup_ovsdb_server_semaphore Dec 04 18:00:33 crc kubenswrapper[4948]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 18:00:33 crc kubenswrapper[4948]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 04 18:00:33 crc kubenswrapper[4948]: > pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" containerID="cri-o://8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.087573 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" containerID="cri-o://8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" gracePeriod=28 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.159033 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="rabbitmq" containerID="cri-o://de019385e7338481198dc33686e0126bb41672f2effc6fd4c866ef06770f14f7" gracePeriod=604800 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.166127 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-54fb4df596-9xk9m"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.166395 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" 
podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener-log" containerID="cri-o://b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.166862 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener" containerID="cri-o://ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.207001 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-z2hx2"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.279017 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e276-account-create-update-99s2v"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.292843 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e276-account-create-update-99s2v"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.300254 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-z2hx2"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.307274 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1e276-account-delete-5xw8n"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.315798 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d5f54fb74-68pcc"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.316019 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d5f54fb74-68pcc" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api-log" containerID="cri-o://b6f89e0991c69b6cee27c2fa1ab82523519a1bb22b60f0d9a6b4e7f17f7f22bc" gracePeriod=30 Dec 04 18:00:33 crc 
kubenswrapper[4948]: I1204 18:00:33.316636 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d5f54fb74-68pcc" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api" containerID="cri-o://6b974c2237fa21ad81c9fc52a94fd34f294e0f3775cb43b085b2410092be36d4" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.348095 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.348381 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6458efcd-4f47-46a1-92ab-3f1c77035cce" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://81a88ee925738bec54ae478b0412037a10331c49f88928f7e3a4b1b1fb31441f" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.413404 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.413649 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="214441b7-69b1-4518-a135-73de11d39a1d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.435405 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xbz9m"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.505860 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" containerName="galera" containerID="cri-o://c6364a91b688011f494239085545a963704364e17364c1672c50b66b56b55484" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.517387 4948 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-xbz9m"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.536809 4948 generic.go:334] "Generic (PLEG): container finished" podID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerID="648062ad89f6bf56de82ee3bd52951b86df3e900c322ee6b0c57f21f40ba73d8" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.537874 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d56c8fbdd-fr7fc" event={"ID":"117c809e-76fd-458e-acbf-e2f6ce2d2f43","Type":"ContainerDied","Data":"648062ad89f6bf56de82ee3bd52951b86df3e900c322ee6b0c57f21f40ba73d8"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.537938 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p2zk5"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.539887 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-p2zk5"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.541632 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.541721 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data podName:90b4baf7-8366-4f47-8515-c33e1b691856 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:35.54170134 +0000 UTC m=+2046.902775742 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data") pod "rabbitmq-server-0" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856") : configmap "rabbitmq-config-data" not found Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.542513 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6840a402-94d3-48e6-9ccb-d578573e430a/ovsdbserver-sb/0.log" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.542590 4948 generic.go:334] "Generic (PLEG): container finished" podID="6840a402-94d3-48e6-9ccb-d578573e430a" containerID="055149cc29d5a8a0fbd7d07c17f45bd048d5a2c834bca9a88e2e890d03daf20f" exitCode=2 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.542613 4948 generic.go:334] "Generic (PLEG): container finished" podID="6840a402-94d3-48e6-9ccb-d578573e430a" containerID="568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.542812 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6840a402-94d3-48e6-9ccb-d578573e430a","Type":"ContainerDied","Data":"055149cc29d5a8a0fbd7d07c17f45bd048d5a2c834bca9a88e2e890d03daf20f"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.542854 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6840a402-94d3-48e6-9ccb-d578573e430a","Type":"ContainerDied","Data":"568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.545734 4948 generic.go:334] "Generic (PLEG): container finished" podID="3326569d-4475-4365-8d93-b2b1522b6f60" containerID="c91a77902e7ba5e05adfd3330e1a213f391e817ac078428b5350fe1e14dbe94b" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.545850 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" event={"ID":"3326569d-4475-4365-8d93-b2b1522b6f60","Type":"ContainerDied","Data":"c91a77902e7ba5e05adfd3330e1a213f391e817ac078428b5350fe1e14dbe94b"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.549775 4948 generic.go:334] "Generic (PLEG): container finished" podID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerID="7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.549853 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7b8cbd95-z6gmw" event={"ID":"0fc74dcc-f8d8-4852-913a-77cb4526eed7","Type":"ContainerDied","Data":"7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.557344 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.558197 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" containerName="nova-cell1-conductor-conductor" containerID="cri-o://4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.572888 4948 generic.go:334] "Generic (PLEG): container finished" podID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerID="ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.573023 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" event={"ID":"c94e22e0-c0d1-4233-b21c-9860d204c068","Type":"ContainerDied","Data":"ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.580160 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_0cc3ac35-04df-4516-8623-b6a0d855c98a/ovsdbserver-nb/0.log" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.580373 4948 generic.go:334] "Generic (PLEG): container finished" podID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerID="f9414a835d8c13eaf98608d7189bcb0337dfc8008b5995dd6461769594b0b04b" exitCode=2 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.580451 4948 generic.go:334] "Generic (PLEG): container finished" podID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerID="0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.580557 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cc3ac35-04df-4516-8623-b6a0d855c98a","Type":"ContainerDied","Data":"f9414a835d8c13eaf98608d7189bcb0337dfc8008b5995dd6461769594b0b04b"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.580729 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cc3ac35-04df-4516-8623-b6a0d855c98a","Type":"ContainerDied","Data":"0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.604378 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.614566 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.614777 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerName="nova-scheduler-scheduler" containerID="cri-o://72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" gracePeriod=30 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.637417 4948 generic.go:334] "Generic (PLEG): container 
finished" podID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerID="87d492aabd3820482e4066aa4ce2c353d0f6250c208d12c75d4f38c935248ea8" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.637512 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60b408db-1dec-49e0-8212-1193d4fe6a37","Type":"ContainerDied","Data":"87d492aabd3820482e4066aa4ce2c353d0f6250c208d12c75d4f38c935248ea8"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.666104 4948 generic.go:334] "Generic (PLEG): container finished" podID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerID="b6f89e0991c69b6cee27c2fa1ab82523519a1bb22b60f0d9a6b4e7f17f7f22bc" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.666190 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d5f54fb74-68pcc" event={"ID":"be3e0d09-a01a-4f1c-9fbd-60a23a823e31","Type":"ContainerDied","Data":"b6f89e0991c69b6cee27c2fa1ab82523519a1bb22b60f0d9a6b4e7f17f7f22bc"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.667947 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kgd58_64ae0228-b131-4cec-a52f-b5786c22355c/openstack-network-exporter/0.log" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.668011 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kgd58" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.669593 4948 generic.go:334] "Generic (PLEG): container finished" podID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerID="7ac6688f04690776901f6c203c84667f87067c95bbc8e52dbe5b9d9106e8a071" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.669643 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3","Type":"ContainerDied","Data":"7ac6688f04690776901f6c203c84667f87067c95bbc8e52dbe5b9d9106e8a071"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.683284 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" containerID="d9a6ef11482121f4a842966dc02f23660d77b8786023f6691bf3ecec31db0c0c" exitCode=137 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.697411 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bd2ch" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.699789 4948 generic.go:334] "Generic (PLEG): container finished" podID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerID="b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.699849 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" event={"ID":"e905edc7-cd78-48c2-9192-fb18e1d193ac","Type":"ContainerDied","Data":"b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.700377 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq" containerID="cri-o://ce3cf731c06ee83c40bae89c0c8e62893dd7be16f5ea71cde48d876fb17f3f41" gracePeriod=604800 Dec 
04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.702970 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.726933 4948 generic.go:334] "Generic (PLEG): container finished" podID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerID="4ae447a7f1fce2c6cfb74358e924d30d23ffd33d65e3ced21c1749bddfe8ce91" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.727105 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c08574c-af0f-4e7c-81af-b180b29ce4ee","Type":"ContainerDied","Data":"4ae447a7f1fce2c6cfb74358e924d30d23ffd33d65e3ced21c1749bddfe8ce91"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755023 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hzjg\" (UniqueName: \"kubernetes.io/projected/64ae0228-b131-4cec-a52f-b5786c22355c-kube-api-access-4hzjg\") pod \"64ae0228-b131-4cec-a52f-b5786c22355c\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755118 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovs-rundir\") pod \"64ae0228-b131-4cec-a52f-b5786c22355c\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755159 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-log-ovn\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755180 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-combined-ca-bundle\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755236 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-combined-ca-bundle\") pod \"64ae0228-b131-4cec-a52f-b5786c22355c\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755254 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run-ovn\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755294 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755340 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-swift-storage-0\") pod \"3326569d-4475-4365-8d93-b2b1522b6f60\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.755414 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-nb\") pod \"3326569d-4475-4365-8d93-b2b1522b6f60\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 
18:00:33.755443 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-config\") pod \"3326569d-4475-4365-8d93-b2b1522b6f60\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756221 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766q6\" (UniqueName: \"kubernetes.io/projected/3326569d-4475-4365-8d93-b2b1522b6f60-kube-api-access-766q6\") pod \"3326569d-4475-4365-8d93-b2b1522b6f60\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756313 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovn-rundir\") pod \"64ae0228-b131-4cec-a52f-b5786c22355c\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756337 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-ovn-controller-tls-certs\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756368 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-scripts\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756418 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-sb\") pod 
\"3326569d-4475-4365-8d93-b2b1522b6f60\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756551 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-metrics-certs-tls-certs\") pod \"64ae0228-b131-4cec-a52f-b5786c22355c\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756628 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5p2x\" (UniqueName: \"kubernetes.io/projected/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-kube-api-access-v5p2x\") pod \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\" (UID: \"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756674 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ae0228-b131-4cec-a52f-b5786c22355c-config\") pod \"64ae0228-b131-4cec-a52f-b5786c22355c\" (UID: \"64ae0228-b131-4cec-a52f-b5786c22355c\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.756693 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-svc\") pod \"3326569d-4475-4365-8d93-b2b1522b6f60\" (UID: \"3326569d-4475-4365-8d93-b2b1522b6f60\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.761816 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-scripts" (OuterVolumeSpecName: "scripts") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.780773 4948 generic.go:334] "Generic (PLEG): container finished" podID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.784801 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ae0228-b131-4cec-a52f-b5786c22355c-config" (OuterVolumeSpecName: "config") pod "64ae0228-b131-4cec-a52f-b5786c22355c" (UID: "64ae0228-b131-4cec-a52f-b5786c22355c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.784880 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "64ae0228-b131-4cec-a52f-b5786c22355c" (UID: "64ae0228-b131-4cec-a52f-b5786c22355c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.784916 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run" (OuterVolumeSpecName: "var-run") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.784939 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.784967 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.784999 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "64ae0228-b131-4cec-a52f-b5786c22355c" (UID: "64ae0228-b131-4cec-a52f-b5786c22355c"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.787438 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerDied","Data":"8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.791379 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ae0228-b131-4cec-a52f-b5786c22355c-kube-api-access-4hzjg" (OuterVolumeSpecName: "kube-api-access-4hzjg") pod "64ae0228-b131-4cec-a52f-b5786c22355c" (UID: "64ae0228-b131-4cec-a52f-b5786c22355c"). InnerVolumeSpecName "kube-api-access-4hzjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.797554 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.798294 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-kube-api-access-v5p2x" (OuterVolumeSpecName: "kube-api-access-v5p2x") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "kube-api-access-v5p2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.798643 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.798698 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.802007 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.802078 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.810307 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.810397 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.815273 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 
18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.815576 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.815601 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="ovn-northd" Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.821300 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:33 crc kubenswrapper[4948]: E1204 18:00:33.821384 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.821426 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3326569d-4475-4365-8d93-b2b1522b6f60-kube-api-access-766q6" (OuterVolumeSpecName: "kube-api-access-766q6") pod "3326569d-4475-4365-8d93-b2b1522b6f60" (UID: "3326569d-4475-4365-8d93-b2b1522b6f60"). InnerVolumeSpecName "kube-api-access-766q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.822342 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="6458efcd-4f47-46a1-92ab-3f1c77035cce" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.197:6080/vnc_lite.html\": dial tcp 10.217.0.197:6080: connect: connection refused" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834371 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834410 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834419 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834429 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834437 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834444 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528" exitCode=0 Dec 04 18:00:33 
crc kubenswrapper[4948]: I1204 18:00:33.834451 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834460 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834659 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834666 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834680 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834686 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834694 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834702 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" 
containerID="d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a" exitCode=0 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834454 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834770 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834806 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834821 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834835 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834848 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834861 4948 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834874 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834885 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834897 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834912 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834925 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.834937 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5"} Dec 04 18:00:33 crc 
kubenswrapper[4948]: I1204 18:00:33.834949 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.840443 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-bd2ch" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.840457 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-bd2ch" event={"ID":"4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1","Type":"ContainerDied","Data":"80e6d672daf9de3de1ae6621a0dc3ac67417142cd01056d8ce7168d193af2efb"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.840716 4948 scope.go:117] "RemoveContainer" containerID="fa492590ec2bee6d8477d4e723bea4207fb5e920790c621ae08dfef6fec716cc" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.844492 4948 generic.go:334] "Generic (PLEG): container finished" podID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerID="9bcf71c31ca1e73a965969fb92ddba24201c5c11e6c3cb3b4e92e77ecdd4bf87" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.844587 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd","Type":"ContainerDied","Data":"9bcf71c31ca1e73a965969fb92ddba24201c5c11e6c3cb3b4e92e77ecdd4bf87"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.849163 4948 generic.go:334] "Generic (PLEG): container finished" podID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerID="b058b84e4f67a262a8cae930973840aaf5fda1c3dfc929a04a6794fb308c7d61" exitCode=143 Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.849204 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c881bee3-e2f3-4da4-a12f-00db430e4323","Type":"ContainerDied","Data":"b058b84e4f67a262a8cae930973840aaf5fda1c3dfc929a04a6794fb308c7d61"} Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859541 4948 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859569 4948 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859577 4948 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859587 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766q6\" (UniqueName: \"kubernetes.io/projected/3326569d-4475-4365-8d93-b2b1522b6f60-kube-api-access-766q6\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859596 4948 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859604 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859612 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5p2x\" (UniqueName: \"kubernetes.io/projected/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-kube-api-access-v5p2x\") on node \"crc\" DevicePath 
\"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859622 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64ae0228-b131-4cec-a52f-b5786c22355c-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859631 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hzjg\" (UniqueName: \"kubernetes.io/projected/64ae0228-b131-4cec-a52f-b5786c22355c-kube-api-access-4hzjg\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.859639 4948 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/64ae0228-b131-4cec-a52f-b5786c22355c-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.880231 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.927030 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.928511 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3326569d-4475-4365-8d93-b2b1522b6f60" (UID: "3326569d-4475-4365-8d93-b2b1522b6f60"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.961397 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config\") pod \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.961686 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-combined-ca-bundle\") pod \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.961786 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdrlm\" (UniqueName: \"kubernetes.io/projected/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-kube-api-access-jdrlm\") pod \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.964204 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config-secret\") pod \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\" (UID: \"9c0787d1-2fd6-4c5c-8e07-44bcbab37320\") " Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.965153 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:33 crc kubenswrapper[4948]: I1204 18:00:33.965175 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.001489 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-kube-api-access-jdrlm" (OuterVolumeSpecName: "kube-api-access-jdrlm") pod "9c0787d1-2fd6-4c5c-8e07-44bcbab37320" (UID: "9c0787d1-2fd6-4c5c-8e07-44bcbab37320"). InnerVolumeSpecName "kube-api-access-jdrlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.012356 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdac4fb3_a888_4781_b1e0_99630c84fe0f.slice/crio-conmon-2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdac4fb3_a888_4781_b1e0_99630c84fe0f.slice/crio-843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdac4fb3_a888_4781_b1e0_99630c84fe0f.slice/crio-conmon-843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdac4fb3_a888_4781_b1e0_99630c84fe0f.slice/crio-2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265.scope\": RecentStats: unable to find data in memory cache]" Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.032493 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544 is running failed: container process not found" containerID="0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.032752 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544 is running failed: container process not found" containerID="0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.032931 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544 is running failed: container process not found" containerID="0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.032962 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="ovsdbserver-nb" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.057546 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-79fbd4d98c-8tdt7"] Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.057829 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-httpd" 
containerID="cri-o://ab060077462d8ce9f643db68a3d7c266453bc9728c23786c15ef088ebea997bf" gracePeriod=30 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.058243 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-server" containerID="cri-o://a43a2bd6b5a97a2d0c5ce373003143582211cc9d3bf7982dbc10ceb659d870a6" gracePeriod=30 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.060142 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3326569d-4475-4365-8d93-b2b1522b6f60" (UID: "3326569d-4475-4365-8d93-b2b1522b6f60"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.067750 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdrlm\" (UniqueName: \"kubernetes.io/projected/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-kube-api-access-jdrlm\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.067787 4948 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.094073 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9c0787d1-2fd6-4c5c-8e07-44bcbab37320" (UID: "9c0787d1-2fd6-4c5c-8e07-44bcbab37320"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.100364 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c0787d1-2fd6-4c5c-8e07-44bcbab37320" (UID: "9c0787d1-2fd6-4c5c-8e07-44bcbab37320"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.122225 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ae0228-b131-4cec-a52f-b5786c22355c" (UID: "64ae0228-b131-4cec-a52f-b5786c22355c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.131357 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3326569d-4475-4365-8d93-b2b1522b6f60" (UID: "3326569d-4475-4365-8d93-b2b1522b6f60"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.167499 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0cc3ac35-04df-4516-8623-b6a0d855c98a/ovsdbserver-nb/0.log" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.167573 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.169371 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.169393 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.169404 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.169414 4948 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.172621 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3326569d-4475-4365-8d93-b2b1522b6f60" (UID: "3326569d-4475-4365-8d93-b2b1522b6f60"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.177415 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-config" (OuterVolumeSpecName: "config") pod "3326569d-4475-4365-8d93-b2b1522b6f60" (UID: "3326569d-4475-4365-8d93-b2b1522b6f60"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.232247 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9c0787d1-2fd6-4c5c-8e07-44bcbab37320" (UID: "9c0787d1-2fd6-4c5c-8e07-44bcbab37320"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.232853 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "64ae0228-b131-4cec-a52f-b5786c22355c" (UID: "64ae0228-b131-4cec-a52f-b5786c22355c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.234869 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" (UID: "4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.257421 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.257512 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.257700 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6840a402-94d3-48e6-9ccb-d578573e430a/ovsdbserver-sb/0.log" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.257783 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.270760 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.270830 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.270871 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-metrics-certs-tls-certs\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.270893 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdbserver-nb-tls-certs\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.270928 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd7sf\" (UniqueName: \"kubernetes.io/projected/6840a402-94d3-48e6-9ccb-d578573e430a-kube-api-access-sd7sf\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271004 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdb-rundir\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271029 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-scripts\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271062 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-scripts\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271088 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdbserver-sb-tls-certs\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271122 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-config\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271151 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-config\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271167 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-combined-ca-bundle\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271192 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-combined-ca-bundle\") pod \"6840a402-94d3-48e6-9ccb-d578573e430a\" (UID: \"6840a402-94d3-48e6-9ccb-d578573e430a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271215 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mrv9\" (UniqueName: \"kubernetes.io/projected/0cc3ac35-04df-4516-8623-b6a0d855c98a-kube-api-access-2mrv9\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271239 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdb-rundir\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271304 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-metrics-certs-tls-certs\") pod \"0cc3ac35-04df-4516-8623-b6a0d855c98a\" (UID: \"0cc3ac35-04df-4516-8623-b6a0d855c98a\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271678 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-config\") on node \"crc\" DevicePath \"\"" Dec 04 
18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271695 4948 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271705 4948 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9c0787d1-2fd6-4c5c-8e07-44bcbab37320-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271717 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ae0228-b131-4cec-a52f-b5786c22355c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.271726 4948 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3326569d-4475-4365-8d93-b2b1522b6f60-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.273162 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-scripts" (OuterVolumeSpecName: "scripts") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.279807 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.280421 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-config" (OuterVolumeSpecName: "config") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.281393 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.284310 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-config" (OuterVolumeSpecName: "config") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.284409 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.284496 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc3ac35-04df-4516-8623-b6a0d855c98a-kube-api-access-2mrv9" (OuterVolumeSpecName: "kube-api-access-2mrv9") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "kube-api-access-2mrv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.287137 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.287395 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-scripts" (OuterVolumeSpecName: "scripts") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.311820 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6840a402-94d3-48e6-9ccb-d578573e430a-kube-api-access-sd7sf" (OuterVolumeSpecName: "kube-api-access-sd7sf") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "kube-api-access-sd7sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.324680 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.326901 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.331792 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.345545 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.345696 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerName="nova-scheduler-scheduler" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.374773 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375002 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375130 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd7sf\" (UniqueName: \"kubernetes.io/projected/6840a402-94d3-48e6-9ccb-d578573e430a-kube-api-access-sd7sf\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375208 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375275 4948 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375336 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375401 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cc3ac35-04df-4516-8623-b6a0d855c98a-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375466 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6840a402-94d3-48e6-9ccb-d578573e430a-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375540 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375604 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mrv9\" (UniqueName: \"kubernetes.io/projected/0cc3ac35-04df-4516-8623-b6a0d855c98a-kube-api-access-2mrv9\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.375669 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.439511 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.473768 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.478818 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.478981 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.499016 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.508385 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.535312 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6840a402-94d3-48e6-9ccb-d578573e430a" (UID: "6840a402-94d3-48e6-9ccb-d578573e430a"). 
InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.535422 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.543251 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "0cc3ac35-04df-4516-8623-b6a0d855c98a" (UID: "0cc3ac35-04df-4516-8623-b6a0d855c98a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.581940 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.581970 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.581979 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.581988 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6840a402-94d3-48e6-9ccb-d578573e430a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.581997 4948 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc3ac35-04df-4516-8623-b6a0d855c98a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.684699 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.685289 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data podName:b34ca165-31d6-44fa-b175-ed2b1bf9f766 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:38.685234457 +0000 UTC m=+2050.046308869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data") pod "rabbitmq-cell1-server-0" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766") : configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.720479 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-bd2ch"] Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.728817 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-bd2ch"] Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.786718 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.786751 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.786799 4948 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:38.786780109 +0000 UTC m=+2050.147854521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-scripts" not found Dec 04 18:00:34 crc kubenswrapper[4948]: E1204 18:00:34.786868 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:38.786840131 +0000 UTC m=+2050.147914563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-config" not found Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.813860 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.862919 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0cc3ac35-04df-4516-8623-b6a0d855c98a/ovsdbserver-nb/0.log" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.863079 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.864630 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0cc3ac35-04df-4516-8623-b6a0d855c98a","Type":"ContainerDied","Data":"74bd34b58b9a2c2f329ec31a6499e5e08fd2b9554eb628f3b488116ce2c47d7d"} Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.864688 4948 scope.go:117] "RemoveContainer" containerID="f9414a835d8c13eaf98608d7189bcb0337dfc8008b5995dd6461769594b0b04b" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.876528 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6840a402-94d3-48e6-9ccb-d578573e430a/ovsdbserver-sb/0.log" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.876624 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6840a402-94d3-48e6-9ccb-d578573e430a","Type":"ContainerDied","Data":"fc2ebaecc87fa15fe23f8b47df9c3ae931281d7dcb5e5c98d32e936e8f1b9a17"} Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.876721 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.884168 4948 generic.go:334] "Generic (PLEG): container finished" podID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerID="a43a2bd6b5a97a2d0c5ce373003143582211cc9d3bf7982dbc10ceb659d870a6" exitCode=0 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.884201 4948 generic.go:334] "Generic (PLEG): container finished" podID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerID="ab060077462d8ce9f643db68a3d7c266453bc9728c23786c15ef088ebea997bf" exitCode=0 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.884254 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" event={"ID":"89ecb28d-b878-4b16-a46a-9d9be1441aca","Type":"ContainerDied","Data":"a43a2bd6b5a97a2d0c5ce373003143582211cc9d3bf7982dbc10ceb659d870a6"} Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.884278 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" event={"ID":"89ecb28d-b878-4b16-a46a-9d9be1441aca","Type":"ContainerDied","Data":"ab060077462d8ce9f643db68a3d7c266453bc9728c23786c15ef088ebea997bf"} Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.887648 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdac4fb3-a888-4781-b1e0-99630c84fe0f-etc-machine-id\") pod \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.887833 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data\") pod \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.887917 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jz2cw\" (UniqueName: \"kubernetes.io/projected/cdac4fb3-a888-4781-b1e0-99630c84fe0f-kube-api-access-jz2cw\") pod \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.887989 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-scripts\") pod \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.888227 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-combined-ca-bundle\") pod \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.888276 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data-custom\") pod \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\" (UID: \"cdac4fb3-a888-4781-b1e0-99630c84fe0f\") " Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.890225 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdac4fb3-a888-4781-b1e0-99630c84fe0f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cdac4fb3-a888-4781-b1e0-99630c84fe0f" (UID: "cdac4fb3-a888-4781-b1e0-99630c84fe0f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.901512 4948 generic.go:334] "Generic (PLEG): container finished" podID="10997b06-2476-4c6c-865d-1e5927e75fac" containerID="c6364a91b688011f494239085545a963704364e17364c1672c50b66b56b55484" exitCode=0 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.901586 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"10997b06-2476-4c6c-865d-1e5927e75fac","Type":"ContainerDied","Data":"c6364a91b688011f494239085545a963704364e17364c1672c50b66b56b55484"} Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.905976 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-scripts" (OuterVolumeSpecName: "scripts") pod "cdac4fb3-a888-4781-b1e0-99630c84fe0f" (UID: "cdac4fb3-a888-4781-b1e0-99630c84fe0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.906085 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cdac4fb3-a888-4781-b1e0-99630c84fe0f" (UID: "cdac4fb3-a888-4781-b1e0-99630c84fe0f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.907586 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.934242 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.958218 4948 generic.go:334] "Generic (PLEG): container finished" podID="6458efcd-4f47-46a1-92ab-3f1c77035cce" containerID="81a88ee925738bec54ae478b0412037a10331c49f88928f7e3a4b1b1fb31441f" exitCode=0 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.969800 4948 scope.go:117] "RemoveContainer" containerID="0afd0a938ed89b11ef030bb238f7a633318fb28e22cf192b071cb5022da3b544" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.970013 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdac4fb3-a888-4781-b1e0-99630c84fe0f-kube-api-access-jz2cw" (OuterVolumeSpecName: "kube-api-access-jz2cw") pod "cdac4fb3-a888-4781-b1e0-99630c84fe0f" (UID: "cdac4fb3-a888-4781-b1e0-99630c84fe0f"). InnerVolumeSpecName "kube-api-access-jz2cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.970397 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.976930 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" path="/var/lib/kubelet/pods/4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1/volumes" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.977488 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0787d1-2fd6-4c5c-8e07-44bcbab37320" path="/var/lib/kubelet/pods/9c0787d1-2fd6-4c5c-8e07-44bcbab37320/volumes" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.978095 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9360a48-9890-45f3-8fc3-551ba8c1521e" path="/var/lib/kubelet/pods/c9360a48-9890-45f3-8fc3-551ba8c1521e/volumes" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.979069 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa89602-ba65-4bc5-90d0-c91e6be39d1e" path="/var/lib/kubelet/pods/dfa89602-ba65-4bc5-90d0-c91e6be39d1e/volumes" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.979560 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed713933-db04-4fdc-805d-7306d1cf2ec3" path="/var/lib/kubelet/pods/ed713933-db04-4fdc-805d-7306d1cf2ec3/volumes" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.984618 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5dc382-7aaa-4191-8605-dd03299ca26d" path="/var/lib/kubelet/pods/ff5dc382-7aaa-4191-8605-dd03299ca26d/volumes" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.986334 4948 generic.go:334] "Generic (PLEG): container finished" podID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerID="2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265" exitCode=0 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.986506 4948 generic.go:334] "Generic (PLEG): container finished" podID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" 
containerID="843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc" exitCode=0 Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.986621 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kgd58" Dec 04 18:00:34 crc kubenswrapper[4948]: I1204 18:00:34.989530 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.029504 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.030358 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6458efcd-4f47-46a1-92ab-3f1c77035cce","Type":"ContainerDied","Data":"81a88ee925738bec54ae478b0412037a10331c49f88928f7e3a4b1b1fb31441f"} Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.030415 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.030433 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-hnjl5" event={"ID":"3326569d-4475-4365-8d93-b2b1522b6f60","Type":"ContainerDied","Data":"0906b04e9303e652de258fded09bd6b6ebff496fc77ca9796ec94bb6c580e83b"} Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.030537 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cdac4fb3-a888-4781-b1e0-99630c84fe0f","Type":"ContainerDied","Data":"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265"} Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.030556 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cdac4fb3-a888-4781-b1e0-99630c84fe0f","Type":"ContainerDied","Data":"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc"} Dec 04 
18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.030584 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cdac4fb3-a888-4781-b1e0-99630c84fe0f","Type":"ContainerDied","Data":"14825261de269435ce1738d5aee1d307e7f78eb8bc1ae29be0c071bdd46bd699"} Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.034736 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdac4fb3-a888-4781-b1e0-99630c84fe0f" (UID: "cdac4fb3-a888-4781-b1e0-99630c84fe0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.034912 4948 scope.go:117] "RemoveContainer" containerID="055149cc29d5a8a0fbd7d07c17f45bd048d5a2c834bca9a88e2e890d03daf20f" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.039211 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.045413 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.069315 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.094441 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz2cw\" (UniqueName: \"kubernetes.io/projected/cdac4fb3-a888-4781-b1e0-99630c84fe0f-kube-api-access-jz2cw\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.094646 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.094662 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.094671 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.094680 4948 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdac4fb3-a888-4781-b1e0-99630c84fe0f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.098106 4948 scope.go:117] "RemoveContainer" containerID="568a85ebd27324ca89fa5287b4f57dd6f466e6af123fbd8cabe5c1985c81771c" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.108568 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-hnjl5"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.128454 4948 scope.go:117] "RemoveContainer" containerID="d9a6ef11482121f4a842966dc02f23660d77b8786023f6691bf3ecec31db0c0c" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.134992 
4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-hnjl5"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.159672 4948 scope.go:117] "RemoveContainer" containerID="c91a77902e7ba5e05adfd3330e1a213f391e817ac078428b5350fe1e14dbe94b" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.165948 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.168652 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data" (OuterVolumeSpecName: "config-data") pod "cdac4fb3-a888-4781-b1e0-99630c84fe0f" (UID: "cdac4fb3-a888-4781-b1e0-99630c84fe0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.181698 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-kgd58"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.189235 4948 scope.go:117] "RemoveContainer" containerID="2e82af8d0ee65dbf61f13fbc7e3f43e88954a4a8f887ae20e99e967abdfd0b62" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.196020 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-combined-ca-bundle\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.196507 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.196568 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-kolla-config\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.196917 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-default\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.196965 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-nova-novncproxy-tls-certs\") pod \"6458efcd-4f47-46a1-92ab-3f1c77035cce\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.196988 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpgg9\" (UniqueName: \"kubernetes.io/projected/6458efcd-4f47-46a1-92ab-3f1c77035cce-kube-api-access-hpgg9\") pod \"6458efcd-4f47-46a1-92ab-3f1c77035cce\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197026 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-galera-tls-certs\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197070 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-vencrypt-tls-certs\") pod \"6458efcd-4f47-46a1-92ab-3f1c77035cce\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197091 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-config-data\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197114 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-log-httpd\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197359 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197404 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197465 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-generated\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197499 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-config-data\") pod \"6458efcd-4f47-46a1-92ab-3f1c77035cce\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197716 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-combined-ca-bundle\") pod \"6458efcd-4f47-46a1-92ab-3f1c77035cce\" (UID: \"6458efcd-4f47-46a1-92ab-3f1c77035cce\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197758 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-run-httpd\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197802 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-operator-scripts\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197828 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7rc78\" (UniqueName: \"kubernetes.io/projected/10997b06-2476-4c6c-865d-1e5927e75fac-kube-api-access-7rc78\") pod \"10997b06-2476-4c6c-865d-1e5927e75fac\" (UID: \"10997b06-2476-4c6c-865d-1e5927e75fac\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.197860 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-combined-ca-bundle\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198104 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198271 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198812 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdac4fb3-a888-4781-b1e0-99630c84fe0f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198829 4948 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198840 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198868 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.198877 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/89ecb28d-b878-4b16-a46a-9d9be1441aca-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.199953 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.200002 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.211860 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10997b06-2476-4c6c-865d-1e5927e75fac-kube-api-access-7rc78" (OuterVolumeSpecName: "kube-api-access-7rc78") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "kube-api-access-7rc78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.211923 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-kgd58"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.214936 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6458efcd-4f47-46a1-92ab-3f1c77035cce-kube-api-access-hpgg9" (OuterVolumeSpecName: "kube-api-access-hpgg9") pod "6458efcd-4f47-46a1-92ab-3f1c77035cce" (UID: "6458efcd-4f47-46a1-92ab-3f1c77035cce"). InnerVolumeSpecName "kube-api-access-hpgg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.224389 4948 scope.go:117] "RemoveContainer" containerID="2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.245071 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.245988 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.268521 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-config-data" (OuterVolumeSpecName: "config-data") pod "6458efcd-4f47-46a1-92ab-3f1c77035cce" (UID: "6458efcd-4f47-46a1-92ab-3f1c77035cce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.274957 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6458efcd-4f47-46a1-92ab-3f1c77035cce" (UID: "6458efcd-4f47-46a1-92ab-3f1c77035cce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.288974 4948 scope.go:117] "RemoveContainer" containerID="843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.304922 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-internal-tls-certs\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.304970 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-public-tls-certs\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.305173 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-etc-swift\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.305207 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpds8\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-kube-api-access-rpds8\") pod \"89ecb28d-b878-4b16-a46a-9d9be1441aca\" (UID: \"89ecb28d-b878-4b16-a46a-9d9be1441aca\") " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306325 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10997b06-2476-4c6c-865d-1e5927e75fac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306351 4948 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rc78\" (UniqueName: \"kubernetes.io/projected/10997b06-2476-4c6c-865d-1e5927e75fac-kube-api-access-7rc78\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306367 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306393 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306406 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpgg9\" (UniqueName: \"kubernetes.io/projected/6458efcd-4f47-46a1-92ab-3f1c77035cce-kube-api-access-hpgg9\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306420 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/10997b06-2476-4c6c-865d-1e5927e75fac-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306445 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.306458 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.308170 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "10997b06-2476-4c6c-865d-1e5927e75fac" (UID: "10997b06-2476-4c6c-865d-1e5927e75fac"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.309629 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.314540 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-kube-api-access-rpds8" (OuterVolumeSpecName: "kube-api-access-rpds8") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "kube-api-access-rpds8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.361227 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.368889 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.370930 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6458efcd-4f47-46a1-92ab-3f1c77035cce" (UID: "6458efcd-4f47-46a1-92ab-3f1c77035cce"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.406527 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-config-data" (OuterVolumeSpecName: "config-data") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.407918 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.407943 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.407954 4948 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.407966 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpds8\" (UniqueName: \"kubernetes.io/projected/89ecb28d-b878-4b16-a46a-9d9be1441aca-kube-api-access-rpds8\") on node \"crc\" DevicePath \"\"" Dec 04 
18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.407979 4948 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/10997b06-2476-4c6c-865d-1e5927e75fac-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.407991 4948 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.408001 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.409148 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanaee8-account-delete-r5gkz"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.439325 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder20a5-account-delete-b2bnv"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.459103 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement7046-account-delete-d78kq"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.467008 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.471856 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6458efcd-4f47-46a1-92ab-3f1c77035cce" (UID: "6458efcd-4f47-46a1-92ab-3f1c77035cce"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.493412 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1e276-account-delete-5xw8n"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.501642 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "89ecb28d-b878-4b16-a46a-9d9be1441aca" (UID: "89ecb28d-b878-4b16-a46a-9d9be1441aca"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.508683 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapide93-account-delete-s9wkh"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.509775 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.509800 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ecb28d-b878-4b16-a46a-9d9be1441aca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.509812 4948 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6458efcd-4f47-46a1-92ab-3f1c77035cce-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.524912 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance47ce-account-delete-9mwp2"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.534400 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron4bb1-account-delete-4fsjg"] Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.540196 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467 is running failed: container process not found" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.542336 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound 
desc = container is not created or running: checking if PID of 4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467 is running failed: container process not found" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 18:00:35 crc kubenswrapper[4948]: W1204 18:00:35.542499 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d563e18_b478_40af_b4c6_b2dd89ea863a.slice/crio-5aaf8d1d51e73c6ee336638463fec279588906f1fe784b96badda2d3faa37666 WatchSource:0}: Error finding container 5aaf8d1d51e73c6ee336638463fec279588906f1fe784b96badda2d3faa37666: Status 404 returned error can't find the container with id 5aaf8d1d51e73c6ee336638463fec279588906f1fe784b96badda2d3faa37666 Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.543505 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467 is running failed: container process not found" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.543534 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" containerName="nova-cell1-conductor-conductor" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.554312 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0a2da-account-delete-2tst9"] Dec 04 18:00:35 crc kubenswrapper[4948]: W1204 
18:00:35.558414 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48d8f605_3274_40ec_8a30_8dc188fdcd86.slice/crio-4311ac794bbba6c28715202d1fa2559048c4ea66aa409b3ff1f8b56f625021bb WatchSource:0}: Error finding container 4311ac794bbba6c28715202d1fa2559048c4ea66aa409b3ff1f8b56f625021bb: Status 404 returned error can't find the container with id 4311ac794bbba6c28715202d1fa2559048c4ea66aa409b3ff1f8b56f625021bb Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.610326 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.613592 4948 scope.go:117] "RemoveContainer" containerID="2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265" Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.614276 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265\": container with ID starting with 2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265 not found: ID does not exist" containerID="2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.614337 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265"} err="failed to get container status \"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265\": rpc error: code = NotFound desc = could not find container \"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265\": container with ID starting with 2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265 not found: ID does not exist" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.614372 4948 scope.go:117] "RemoveContainer" 
containerID="843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc" Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.615332 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc\": container with ID starting with 843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc not found: ID does not exist" containerID="843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.615389 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc"} err="failed to get container status \"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc\": rpc error: code = NotFound desc = could not find container \"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc\": container with ID starting with 843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc not found: ID does not exist" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.615407 4948 scope.go:117] "RemoveContainer" containerID="2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.617593 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265"} err="failed to get container status \"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265\": rpc error: code = NotFound desc = could not find container \"2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265\": container with ID starting with 2f02d20f20eaadf16f3c87639b15c4ee1750f9d35ce7cb91f48289ea8005f265 not found: ID does not exist" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.617655 4948 scope.go:117] 
"RemoveContainer" containerID="843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc" Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.617954 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc"} err="failed to get container status \"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc\": rpc error: code = NotFound desc = could not find container \"843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc\": container with ID starting with 843777531acb45003f3bd7d592822db0bb98c9db3ae61021773ed8900b4b3ddc not found: ID does not exist" Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.620741 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 18:00:35 crc kubenswrapper[4948]: E1204 18:00:35.620812 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data podName:90b4baf7-8366-4f47-8515-c33e1b691856 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:39.620797153 +0000 UTC m=+2050.981871555 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data") pod "rabbitmq-server-0" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856") : configmap "rabbitmq-config-data" not found Dec 04 18:00:35 crc kubenswrapper[4948]: W1204 18:00:35.622580 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1b64e38_8be0_41af_bf89_878d17bbd7a5.slice/crio-df6e49ae2bdee362d0ade5799fa7b2e79b9b2af3709ed1474bfd46451a2158d4 WatchSource:0}: Error finding container df6e49ae2bdee362d0ade5799fa7b2e79b9b2af3709ed1474bfd46451a2158d4: Status 404 returned error can't find the container with id df6e49ae2bdee362d0ade5799fa7b2e79b9b2af3709ed1474bfd46451a2158d4 Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.640164 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 18:00:35 crc kubenswrapper[4948]: I1204 18:00:35.856014 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.021209 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6458efcd-4f47-46a1-92ab-3f1c77035cce","Type":"ContainerDied","Data":"61521069d25143e3b65a5c27570a323145f5f615e05b0c63d50d32acb4997549"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.021559 4948 scope.go:117] "RemoveContainer" containerID="81a88ee925738bec54ae478b0412037a10331c49f88928f7e3a4b1b1fb31441f" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.021739 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.030988 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlsqw\" (UniqueName: \"kubernetes.io/projected/e318bac5-87da-4a9b-9d73-8065c65f4b61-kube-api-access-nlsqw\") pod \"e318bac5-87da-4a9b-9d73-8065c65f4b61\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.031144 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-config-data\") pod \"e318bac5-87da-4a9b-9d73-8065c65f4b61\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.031222 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-combined-ca-bundle\") pod \"e318bac5-87da-4a9b-9d73-8065c65f4b61\" (UID: \"e318bac5-87da-4a9b-9d73-8065c65f4b61\") " Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.039151 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" event={"ID":"89ecb28d-b878-4b16-a46a-9d9be1441aca","Type":"ContainerDied","Data":"0982a9d289d1595dfc873dce3bf4796b152b1f5667880fda83ced6877fcbfd94"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.039250 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-79fbd4d98c-8tdt7" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.041535 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e318bac5-87da-4a9b-9d73-8065c65f4b61-kube-api-access-nlsqw" (OuterVolumeSpecName: "kube-api-access-nlsqw") pod "e318bac5-87da-4a9b-9d73-8065c65f4b61" (UID: "e318bac5-87da-4a9b-9d73-8065c65f4b61"). InnerVolumeSpecName "kube-api-access-nlsqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.044196 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1e276-account-delete-5xw8n" event={"ID":"4d563e18-b478-40af-b4c6-b2dd89ea863a","Type":"ContainerStarted","Data":"5aaf8d1d51e73c6ee336638463fec279588906f1fe784b96badda2d3faa37666"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.047029 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapide93-account-delete-s9wkh" event={"ID":"48d8f605-3274-40ec-8a30-8dc188fdcd86","Type":"ContainerStarted","Data":"4311ac794bbba6c28715202d1fa2559048c4ea66aa409b3ff1f8b56f625021bb"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.058561 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanaee8-account-delete-r5gkz" event={"ID":"9acee6d3-23af-4793-8e56-8f3fbc169779","Type":"ContainerStarted","Data":"ee2f0fda51ef8d33013ab45223908220da2e917552a17842ef747e4792ebb736"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.058613 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanaee8-account-delete-r5gkz" event={"ID":"9acee6d3-23af-4793-8e56-8f3fbc169779","Type":"ContainerStarted","Data":"3ba5a2d6c81678c189f381dbb956b8fca5e299a07868c418916fb3e924c06e2a"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.075956 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4bb1-account-delete-4fsjg" 
event={"ID":"59806891-9fa2-446a-87c1-b7efbf4b692b","Type":"ContainerStarted","Data":"403bba4c9b8c6be6f89387f93ad9236f22b8236dff6af9a494652ac5021e7856"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.086985 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbicanaee8-account-delete-r5gkz" podStartSLOduration=5.086965744 podStartE2EDuration="5.086965744s" podCreationTimestamp="2025-12-04 18:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 18:00:36.081749337 +0000 UTC m=+2047.442823739" watchObservedRunningTime="2025-12-04 18:00:36.086965744 +0000 UTC m=+2047.448040146" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.092613 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder20a5-account-delete-b2bnv" event={"ID":"31e5cc30-bac1-418c-af51-af5cb1d8d595","Type":"ContainerStarted","Data":"d311eed4082742e9e355378e89115c1568f4c2513b74e4ce4b82dfb716efc259"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.112335 4948 generic.go:334] "Generic (PLEG): container finished" podID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerID="db8b9187d0c187cfc911c618a1e41befbb49ce369abd91c26b5274db741964ad" exitCode=0 Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.112428 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c08574c-af0f-4e7c-81af-b180b29ce4ee","Type":"ContainerDied","Data":"db8b9187d0c187cfc911c618a1e41befbb49ce369abd91c26b5274db741964ad"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.123146 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-79fbd4d98c-8tdt7"] Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.131220 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-79fbd4d98c-8tdt7"] Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 
18:00:36.133744 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-config-data" (OuterVolumeSpecName: "config-data") pod "e318bac5-87da-4a9b-9d73-8065c65f4b61" (UID: "e318bac5-87da-4a9b-9d73-8065c65f4b61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.138388 4948 generic.go:334] "Generic (PLEG): container finished" podID="e318bac5-87da-4a9b-9d73-8065c65f4b61" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" exitCode=0 Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.138710 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.138893 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e318bac5-87da-4a9b-9d73-8065c65f4b61" (UID: "e318bac5-87da-4a9b-9d73-8065c65f4b61"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.139006 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e318bac5-87da-4a9b-9d73-8065c65f4b61","Type":"ContainerDied","Data":"4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.139115 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e318bac5-87da-4a9b-9d73-8065c65f4b61","Type":"ContainerDied","Data":"9e35f40a85bf47055c35b75f9d74b4fb908aaf754206e65e733571703469c199"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.141665 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlsqw\" (UniqueName: \"kubernetes.io/projected/e318bac5-87da-4a9b-9d73-8065c65f4b61-kube-api-access-nlsqw\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.142542 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.142623 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e318bac5-87da-4a9b-9d73-8065c65f4b61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.149237 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7046-account-delete-d78kq" event={"ID":"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591","Type":"ContainerStarted","Data":"5b30dec60cadf6795af400b00340ae441c03abdc9d849bec81f8c28d4adf020f"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.150293 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 18:00:36 crc kubenswrapper[4948]: 
E1204 18:00:36.154027 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 18:00:36 crc kubenswrapper[4948]: E1204 18:00:36.157884 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 18:00:36 crc kubenswrapper[4948]: E1204 18:00:36.163000 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 18:00:36 crc kubenswrapper[4948]: E1204 18:00:36.163052 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="214441b7-69b1-4518-a135-73de11d39a1d" containerName="nova-cell0-conductor-conductor" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.169934 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance47ce-account-delete-9mwp2" event={"ID":"80f9ff11-f145-4e76-a9fc-084de8ccb029","Type":"ContainerStarted","Data":"979c7fd0ffb2f20249d54d4e390608046276ce5bb279de1d81fdd19a936f1f77"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.169980 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.170887 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder20a5-account-delete-b2bnv" podStartSLOduration=5.170865703 podStartE2EDuration="5.170865703s" podCreationTimestamp="2025-12-04 18:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 18:00:36.133964656 +0000 UTC m=+2047.495039068" watchObservedRunningTime="2025-12-04 18:00:36.170865703 +0000 UTC m=+2047.531940105" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.181927 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0a2da-account-delete-2tst9" event={"ID":"e1b64e38-8be0-41af-bf89-878d17bbd7a5","Type":"ContainerStarted","Data":"df6e49ae2bdee362d0ade5799fa7b2e79b9b2af3709ed1474bfd46451a2158d4"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.190301 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"10997b06-2476-4c6c-865d-1e5927e75fac","Type":"ContainerDied","Data":"2ddc3499f14dc2b4545cd0e2d863bced0a2ebfc46c6eef3fc72c288c305cd155"} Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.190413 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.195883 4948 scope.go:117] "RemoveContainer" containerID="a43a2bd6b5a97a2d0c5ce373003143582211cc9d3bf7982dbc10ceb659d870a6" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.342798 4948 scope.go:117] "RemoveContainer" containerID="ab060077462d8ce9f643db68a3d7c266453bc9728c23786c15ef088ebea997bf" Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.698123 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.698822 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-central-agent" containerID="cri-o://cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a" gracePeriod=30 Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.700394 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="proxy-httpd" containerID="cri-o://f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54" gracePeriod=30 Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.700574 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-notification-agent" containerID="cri-o://62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b" gracePeriod=30 Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.700617 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="sg-core" containerID="cri-o://a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f" gracePeriod=30 Dec 04 18:00:36 crc 
kubenswrapper[4948]: I1204 18:00:36.732260 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.732565 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" containerName="kube-state-metrics" containerID="cri-o://bf58486ad5fb8f9b6caf20551fac4a932d36fd2cc04ef2bf024f6aa264c91c35" gracePeriod=30 Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.879876 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 04 18:00:36 crc kubenswrapper[4948]: I1204 18:00:36.880102 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="fce6fe82-2dcb-49cd-851a-446e66038965" containerName="memcached" containerID="cri-o://ca9cbacbe11dab0d67e4befee314eff5c109090f17e6413c4b39e087ea4c3f46" gracePeriod=30 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:36.910576 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d5f54fb74-68pcc" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:38682->10.217.0.158:9311: read: connection reset by peer" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:36.910584 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d5f54fb74-68pcc" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:38680->10.217.0.158:9311: read: connection reset by peer" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.073962 4948 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/openstack-galera-0" secret="" err="secret \"galera-openstack-dockercfg-9d9sk\" not found" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.216755 4948 generic.go:334] "Generic (PLEG): container finished" podID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerID="228923eb8a21101983b2bce76096ee36269db4118b9d4b8fa58a9ef47c3110a3" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.230790 4948 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.230863 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:37.730840851 +0000 UTC m=+2049.091915353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.232409 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.232460 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:37.732444193 +0000 UTC m=+2049.093518595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-scripts" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.232648 4948 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.232680 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:37.732668879 +0000 UTC m=+2049.093743281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.238619 4948 generic.go:334] "Generic (PLEG): container finished" podID="9acee6d3-23af-4793-8e56-8f3fbc169779" containerID="ee2f0fda51ef8d33013ab45223908220da2e917552a17842ef747e4792ebb736" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.260213 4948 generic.go:334] "Generic (PLEG): container finished" podID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerID="3ed5978b64fee059b95b3f3fcb1a1ab665b53aab15fb25269bdf21eeb866ef81" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.264074 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" path="/var/lib/kubelet/pods/0cc3ac35-04df-4516-8623-b6a0d855c98a/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.264832 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3326569d-4475-4365-8d93-b2b1522b6f60" path="/var/lib/kubelet/pods/3326569d-4475-4365-8d93-b2b1522b6f60/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.265554 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6458efcd-4f47-46a1-92ab-3f1c77035cce" path="/var/lib/kubelet/pods/6458efcd-4f47-46a1-92ab-3f1c77035cce/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.268079 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ae0228-b131-4cec-a52f-b5786c22355c" path="/var/lib/kubelet/pods/64ae0228-b131-4cec-a52f-b5786c22355c/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.272107 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" path="/var/lib/kubelet/pods/6840a402-94d3-48e6-9ccb-d578573e430a/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.274853 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" path="/var/lib/kubelet/pods/89ecb28d-b878-4b16-a46a-9d9be1441aca/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.275698 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" path="/var/lib/kubelet/pods/cdac4fb3-a888-4781-b1e0-99630c84fe0f/volumes" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277153 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d56c8fbdd-fr7fc" event={"ID":"117c809e-76fd-458e-acbf-e2f6ce2d2f43","Type":"ContainerDied","Data":"228923eb8a21101983b2bce76096ee36269db4118b9d4b8fa58a9ef47c3110a3"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277189 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2v5d6"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277208 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rjcpf"] 
Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277222 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanaee8-account-delete-r5gkz" event={"ID":"9acee6d3-23af-4793-8e56-8f3fbc169779","Type":"ContainerDied","Data":"ee2f0fda51ef8d33013ab45223908220da2e917552a17842ef747e4792ebb736"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277242 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rjcpf"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277263 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2v5d6"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277275 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-54df7858f8-fz456"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277294 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c881bee3-e2f3-4da4-a12f-00db430e4323","Type":"ContainerDied","Data":"3ed5978b64fee059b95b3f3fcb1a1ab665b53aab15fb25269bdf21eeb866ef81"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277312 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c881bee3-e2f3-4da4-a12f-00db430e4323","Type":"ContainerDied","Data":"7861ac1bfcdbf0b6ec2259eab46c686bd03071987d27e20ad8f3c96fb090246f"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277325 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7861ac1bfcdbf0b6ec2259eab46c686bd03071987d27e20ad8f3c96fb090246f" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277337 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277351 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-t8qtr"] Dec 04 18:00:37 crc kubenswrapper[4948]: 
I1204 18:00:37.277364 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-t8qtr"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277377 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c83f-account-create-update-kx2n6"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.277389 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c83f-account-create-update-kx2n6"] Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.278884 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-54df7858f8-fz456" podUID="fb168081-824d-45ef-a815-b96d44b58b7c" containerName="keystone-api" containerID="cri-o://42f8d6eb61951e718daf3a1c3876ae5812998b0025aa93db9e52773f8f045f77" gracePeriod=30 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.290281 4948 generic.go:334] "Generic (PLEG): container finished" podID="31e5cc30-bac1-418c-af51-af5cb1d8d595" containerID="aa1b78c1f482914f1113d2ebd5a93ee7b90f349e894f99d3d833add1e7595f33" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.290608 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder20a5-account-delete-b2bnv" event={"ID":"31e5cc30-bac1-418c-af51-af5cb1d8d595","Type":"ContainerDied","Data":"aa1b78c1f482914f1113d2ebd5a93ee7b90f349e894f99d3d833add1e7595f33"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.313171 4948 generic.go:334] "Generic (PLEG): container finished" podID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerID="f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.313201 4948 generic.go:334] "Generic (PLEG): container finished" podID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerID="a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f" exitCode=2 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.313244 4948 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerDied","Data":"f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.313268 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerDied","Data":"a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.328411 4948 generic.go:334] "Generic (PLEG): container finished" podID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerID="f3c7b7339517046484e3d5e33d506a76290c1be3ff41874ab17e7a9348fa892a" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.328470 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60b408db-1dec-49e0-8212-1193d4fe6a37","Type":"ContainerDied","Data":"f3c7b7339517046484e3d5e33d506a76290c1be3ff41874ab17e7a9348fa892a"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.339079 4948 generic.go:334] "Generic (PLEG): container finished" podID="4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" containerID="bf58486ad5fb8f9b6caf20551fac4a932d36fd2cc04ef2bf024f6aa264c91c35" exitCode=2 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.339281 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc","Type":"ContainerDied","Data":"bf58486ad5fb8f9b6caf20551fac4a932d36fd2cc04ef2bf024f6aa264c91c35"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.343111 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c08574c-af0f-4e7c-81af-b180b29ce4ee","Type":"ContainerDied","Data":"913bf6bea6ca98aae7f0b32a0e3216f8310c0c695066ffe8b7e004fd33427427"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.343228 
4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="913bf6bea6ca98aae7f0b32a0e3216f8310c0c695066ffe8b7e004fd33427427" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.345548 4948 generic.go:334] "Generic (PLEG): container finished" podID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerID="6b974c2237fa21ad81c9fc52a94fd34f294e0f3775cb43b085b2410092be36d4" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.345635 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d5f54fb74-68pcc" event={"ID":"be3e0d09-a01a-4f1c-9fbd-60a23a823e31","Type":"ContainerDied","Data":"6b974c2237fa21ad81c9fc52a94fd34f294e0f3775cb43b085b2410092be36d4"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.347154 4948 generic.go:334] "Generic (PLEG): container finished" podID="59806891-9fa2-446a-87c1-b7efbf4b692b" containerID="7872c1f6fd42a0803b14be61d6958d7a38a6d0fa6968c58defce7378683ea1cc" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.347215 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4bb1-account-delete-4fsjg" event={"ID":"59806891-9fa2-446a-87c1-b7efbf4b692b","Type":"ContainerDied","Data":"7872c1f6fd42a0803b14be61d6958d7a38a6d0fa6968c58defce7378683ea1cc"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.348877 4948 generic.go:334] "Generic (PLEG): container finished" podID="80f9ff11-f145-4e76-a9fc-084de8ccb029" containerID="41105a549e6c1574bdd99b2d5b68e9a2cf1e4860742e66d69d7fe198180e35dd" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.348932 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance47ce-account-delete-9mwp2" event={"ID":"80f9ff11-f145-4e76-a9fc-084de8ccb029","Type":"ContainerDied","Data":"41105a549e6c1574bdd99b2d5b68e9a2cf1e4860742e66d69d7fe198180e35dd"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.350929 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerID="12e734767396eb518b40a349afac5356ca256b1870d63b32a4acf4a594db67b0" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.351125 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3","Type":"ContainerDied","Data":"12e734767396eb518b40a349afac5356ca256b1870d63b32a4acf4a594db67b0"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.352725 4948 generic.go:334] "Generic (PLEG): container finished" podID="48d8f605-3274-40ec-8a30-8dc188fdcd86" containerID="79e062b08955842c2dc531636633118d7f543daf41922d6d8d1ad534ae81a544" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.352875 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapide93-account-delete-s9wkh" event={"ID":"48d8f605-3274-40ec-8a30-8dc188fdcd86","Type":"ContainerDied","Data":"79e062b08955842c2dc531636633118d7f543daf41922d6d8d1ad534ae81a544"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.354948 4948 generic.go:334] "Generic (PLEG): container finished" podID="fce6fe82-2dcb-49cd-851a-446e66038965" containerID="ca9cbacbe11dab0d67e4befee314eff5c109090f17e6413c4b39e087ea4c3f46" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.354992 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fce6fe82-2dcb-49cd-851a-446e66038965","Type":"ContainerDied","Data":"ca9cbacbe11dab0d67e4befee314eff5c109090f17e6413c4b39e087ea4c3f46"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.361760 4948 generic.go:334] "Generic (PLEG): container finished" podID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerID="637497b65838d4e1875162878d30bf8895cfbdd36b9fd9f4596de491cb8f3761" exitCode=0 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.361951 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd","Type":"ContainerDied","Data":"637497b65838d4e1875162878d30bf8895cfbdd36b9fd9f4596de491cb8f3761"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.361983 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd","Type":"ContainerDied","Data":"fb6b9e12c3fdd67fa4143d373ae90ebd6b9aafd6949f02e8e590d78df808ea76"} Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.362049 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb6b9e12c3fdd67fa4143d373ae90ebd6b9aafd6949f02e8e590d78df808ea76" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.423492 4948 scope.go:117] "RemoveContainer" containerID="1796ae814600a948a208a111d68114ac6381992376b853c84c0d4f878894e203" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.634948 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerName="galera" containerID="cri-o://6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163" gracePeriod=30 Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.723665 4948 scope.go:117] "RemoveContainer" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.724211 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.753981 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5b7z\" (UniqueName: \"kubernetes.io/projected/0c08574c-af0f-4e7c-81af-b180b29ce4ee-kube-api-access-z5b7z\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754018 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-combined-ca-bundle\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754057 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-scripts\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754135 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754160 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-public-tls-certs\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754186 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-httpd-run\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754217 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-config-data\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.754235 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-logs\") pod \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\" (UID: \"0c08574c-af0f-4e7c-81af-b180b29ce4ee\") " Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.754613 4948 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.754661 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:38.754646622 +0000 UTC m=+2050.115721024 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.763020 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.763080 4948 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.763141 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:38.763120714 +0000 UTC m=+2050.124195186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-config-data" not found Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.769883 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-logs" (OuterVolumeSpecName: "logs") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.770305 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.770384 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:38.770345854 +0000 UTC m=+2050.131420256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-scripts" not found Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.770410 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.788628 4948 scope.go:117] "RemoveContainer" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" Dec 04 18:00:37 crc kubenswrapper[4948]: E1204 18:00:37.789260 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467\": container with ID starting with 4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467 not found: ID does not exist" containerID="4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.789301 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467"} err="failed to get container status \"4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467\": rpc error: code = NotFound desc = could not find container \"4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467\": container with ID starting with 4cdeabb5ba7b6305a429bd9f05d6c6573f19121345fff8ef1cd437f5bb8cf467 not found: ID does not exist" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.789326 4948 scope.go:117] "RemoveContainer" containerID="c6364a91b688011f494239085545a963704364e17364c1672c50b66b56b55484" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.803122 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.804449 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-scripts" (OuterVolumeSpecName: "scripts") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.813091 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c08574c-af0f-4e7c-81af-b180b29ce4ee-kube-api-access-z5b7z" (OuterVolumeSpecName: "kube-api-access-z5b7z") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "kube-api-access-z5b7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.824012 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.826291 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.827528 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d56c8fbdd-fr7fc" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.828895 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.845232 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.856857 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5b7z\" (UniqueName: \"kubernetes.io/projected/0c08574c-af0f-4e7c-81af-b180b29ce4ee-kube-api-access-z5b7z\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.856892 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.856915 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.859728 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.859779 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c08574c-af0f-4e7c-81af-b180b29ce4ee-logs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.871973 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d5f54fb74-68pcc" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.873209 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.908583 4948 scope.go:117] "RemoveContainer" containerID="e8ee824e9c19c8047c179bbf5ecf993388e37bf34793e8731035445c781ff3b5" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.961769 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brx8x\" (UniqueName: \"kubernetes.io/projected/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-kube-api-access-brx8x\") pod \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962128 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-combined-ca-bundle\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962156 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-logs\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962180 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-nova-metadata-tls-certs\") pod \"60b408db-1dec-49e0-8212-1193d4fe6a37\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962208 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data-custom\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: 
\"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962238 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szdnk\" (UniqueName: \"kubernetes.io/projected/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-api-access-szdnk\") pod \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962264 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b408db-1dec-49e0-8212-1193d4fe6a37-logs\") pod \"60b408db-1dec-49e0-8212-1193d4fe6a37\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962291 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-kolla-config\") pod \"fce6fe82-2dcb-49cd-851a-446e66038965\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962309 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-config\") pod \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962327 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962343 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-config-data\") pod \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962365 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7k4t\" (UniqueName: \"kubernetes.io/projected/c881bee3-e2f3-4da4-a12f-00db430e4323-kube-api-access-j7k4t\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962382 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-scripts\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962399 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pwcc\" (UniqueName: \"kubernetes.io/projected/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-kube-api-access-8pwcc\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962427 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-internal-tls-certs\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962448 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-etc-machine-id\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: 
I1204 18:00:37.962466 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962482 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-public-tls-certs\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962504 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-config-data\") pod \"60b408db-1dec-49e0-8212-1193d4fe6a37\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962523 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data-custom\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962539 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-combined-ca-bundle\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962556 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-logs\") pod 
\"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962571 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs\") pod \"fce6fe82-2dcb-49cd-851a-446e66038965\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962591 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-scripts\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962606 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-combined-ca-bundle\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962621 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw92t\" (UniqueName: \"kubernetes.io/projected/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-kube-api-access-vw92t\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962638 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-public-tls-certs\") pod \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962655 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c809e-76fd-458e-acbf-e2f6ce2d2f43-logs\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962676 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-internal-tls-certs\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962696 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-certs\") pod \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\" (UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962722 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-logs\") pod \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962740 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-internal-tls-certs\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962763 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962767 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-public-tls-certs\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962872 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n55k7\" (UniqueName: \"kubernetes.io/projected/fce6fe82-2dcb-49cd-851a-446e66038965-kube-api-access-n55k7\") pod \"fce6fe82-2dcb-49cd-851a-446e66038965\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962902 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q28n9\" (UniqueName: \"kubernetes.io/projected/117c809e-76fd-458e-acbf-e2f6ce2d2f43-kube-api-access-q28n9\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962927 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flbnc\" (UniqueName: \"kubernetes.io/projected/60b408db-1dec-49e0-8212-1193d4fe6a37-kube-api-access-flbnc\") pod \"60b408db-1dec-49e0-8212-1193d4fe6a37\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962951 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle\") pod \"60b408db-1dec-49e0-8212-1193d4fe6a37\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962974 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-combined-ca-bundle\") pod \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\" 
(UID: \"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.962998 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-httpd-run\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963017 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-scripts\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963054 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-internal-tls-certs\") pod \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\" (UID: \"dfdde2fd-5c98-4b6f-b9a5-a746a454fafd\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963073 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-config-data\") pod \"fce6fe82-2dcb-49cd-851a-446e66038965\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963095 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-config-data\") pod \"c881bee3-e2f3-4da4-a12f-00db430e4323\" (UID: \"c881bee3-e2f3-4da4-a12f-00db430e4323\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963119 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-logs\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963147 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-internal-tls-certs\") pod \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963168 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-public-tls-certs\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963194 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-combined-ca-bundle\") pod \"fce6fe82-2dcb-49cd-851a-446e66038965\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963217 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-combined-ca-bundle\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963246 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data\") pod \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\" (UID: \"be3e0d09-a01a-4f1c-9fbd-60a23a823e31\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 
18:00:37.963264 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-combined-ca-bundle\") pod \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\" (UID: \"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.963288 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-config-data\") pod \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\" (UID: \"117c809e-76fd-458e-acbf-e2f6ce2d2f43\") " Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.966494 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-logs" (OuterVolumeSpecName: "logs") pod "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" (UID: "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.966744 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-logs" (OuterVolumeSpecName: "logs") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.967009 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-logs" (OuterVolumeSpecName: "logs") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.967087 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-logs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.967108 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.967124 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-logs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.968847 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/117c809e-76fd-458e-acbf-e2f6ce2d2f43-logs" (OuterVolumeSpecName: "logs") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: "117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.969803 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b408db-1dec-49e0-8212-1193d4fe6a37-logs" (OuterVolumeSpecName: "logs") pod "60b408db-1dec-49e0-8212-1193d4fe6a37" (UID: "60b408db-1dec-49e0-8212-1193d4fe6a37"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.970303 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "fce6fe82-2dcb-49cd-851a-446e66038965" (UID: "fce6fe82-2dcb-49cd-851a-446e66038965"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.971987 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.983914 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-kube-api-access-brx8x" (OuterVolumeSpecName: "kube-api-access-brx8x") pod "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" (UID: "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3"). InnerVolumeSpecName "kube-api-access-brx8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.990160 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.990940 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-logs" (OuterVolumeSpecName: "logs") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 18:00:37 crc kubenswrapper[4948]: I1204 18:00:37.993031 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-config-data" (OuterVolumeSpecName: "config-data") pod "fce6fe82-2dcb-49cd-851a-446e66038965" (UID: "fce6fe82-2dcb-49cd-851a-446e66038965"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.003775 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b408db-1dec-49e0-8212-1193d4fe6a37-kube-api-access-flbnc" (OuterVolumeSpecName: "kube-api-access-flbnc") pod "60b408db-1dec-49e0-8212-1193d4fe6a37" (UID: "60b408db-1dec-49e0-8212-1193d4fe6a37"). InnerVolumeSpecName "kube-api-access-flbnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015248 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015348 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-scripts" (OuterVolumeSpecName: "scripts") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: "117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015359 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-scripts" (OuterVolumeSpecName: "scripts") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015502 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-kube-api-access-vw92t" (OuterVolumeSpecName: "kube-api-access-vw92t") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "kube-api-access-vw92t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015685 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce6fe82-2dcb-49cd-851a-446e66038965-kube-api-access-n55k7" (OuterVolumeSpecName: "kube-api-access-n55k7") pod "fce6fe82-2dcb-49cd-851a-446e66038965" (UID: "fce6fe82-2dcb-49cd-851a-446e66038965"). InnerVolumeSpecName "kube-api-access-n55k7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015806 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c881bee3-e2f3-4da4-a12f-00db430e4323-kube-api-access-j7k4t" (OuterVolumeSpecName: "kube-api-access-j7k4t") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "kube-api-access-j7k4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.015966 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-api-access-szdnk" (OuterVolumeSpecName: "kube-api-access-szdnk") pod "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" (UID: "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc"). InnerVolumeSpecName "kube-api-access-szdnk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.016183 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.017809 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-scripts" (OuterVolumeSpecName: "scripts") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.021296 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117c809e-76fd-458e-acbf-e2f6ce2d2f43-kube-api-access-q28n9" (OuterVolumeSpecName: "kube-api-access-q28n9") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: "117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "kube-api-access-q28n9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.033568 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-kube-api-access-8pwcc" (OuterVolumeSpecName: "kube-api-access-8pwcc") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "kube-api-access-8pwcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.042377 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079033 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079307 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw92t\" (UniqueName: \"kubernetes.io/projected/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-kube-api-access-vw92t\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079323 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c809e-76fd-458e-acbf-e2f6ce2d2f43-logs\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079336 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n55k7\" (UniqueName: \"kubernetes.io/projected/fce6fe82-2dcb-49cd-851a-446e66038965-kube-api-access-n55k7\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079369 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q28n9\" (UniqueName: \"kubernetes.io/projected/117c809e-76fd-458e-acbf-e2f6ce2d2f43-kube-api-access-q28n9\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079403 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flbnc\" (UniqueName: \"kubernetes.io/projected/60b408db-1dec-49e0-8212-1193d4fe6a37-kube-api-access-flbnc\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079415 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c881bee3-e2f3-4da4-a12f-00db430e4323-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079446 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079461 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079472 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-logs\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079484 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brx8x\" (UniqueName: \"kubernetes.io/projected/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-kube-api-access-brx8x\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079497 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079530 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szdnk\" (UniqueName: \"kubernetes.io/projected/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-api-access-szdnk\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079543 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b408db-1dec-49e0-8212-1193d4fe6a37-logs\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079554 4948 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fce6fe82-2dcb-49cd-851a-446e66038965-kolla-config\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079631 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079647 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7k4t\" (UniqueName: \"kubernetes.io/projected/c881bee3-e2f3-4da4-a12f-00db430e4323-kube-api-access-j7k4t\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.079677 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.080114 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pwcc\" (UniqueName: \"kubernetes.io/projected/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-kube-api-access-8pwcc\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.080371 4948 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.080397 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.080410 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-logs\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.123658 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.180985 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq8tf\" (UniqueName: \"kubernetes.io/projected/214441b7-69b1-4518-a135-73de11d39a1d-kube-api-access-rq8tf\") pod \"214441b7-69b1-4518-a135-73de11d39a1d\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") "
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.181071 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-config-data\") pod \"214441b7-69b1-4518-a135-73de11d39a1d\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") "
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.181371 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-combined-ca-bundle\") pod \"214441b7-69b1-4518-a135-73de11d39a1d\" (UID: \"214441b7-69b1-4518-a135-73de11d39a1d\") "
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.209532 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214441b7-69b1-4518-a135-73de11d39a1d-kube-api-access-rq8tf" (OuterVolumeSpecName: "kube-api-access-rq8tf") pod "214441b7-69b1-4518-a135-73de11d39a1d" (UID: "214441b7-69b1-4518-a135-73de11d39a1d"). InnerVolumeSpecName "kube-api-access-rq8tf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.287777 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq8tf\" (UniqueName: \"kubernetes.io/projected/214441b7-69b1-4518-a135-73de11d39a1d-kube-api-access-rq8tf\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.287879 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-config-data" (OuterVolumeSpecName: "config-data") pod "214441b7-69b1-4518-a135-73de11d39a1d" (UID: "214441b7-69b1-4518-a135-73de11d39a1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.323290 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.344841 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" (UID: "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.376215 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-config-data" (OuterVolumeSpecName: "config-data") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: "117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.397436 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.397477 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.397492 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.397505 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.406885 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.407432 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc","Type":"ContainerDied","Data":"00936facc0bffe2a77c58d42db5363bccdb3d50e0aa8002ebd274826443e9d1a"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.407476 4948 scope.go:117] "RemoveContainer" containerID="bf58486ad5fb8f9b6caf20551fac4a932d36fd2cc04ef2bf024f6aa264c91c35"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.448203 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.452324 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d56c8fbdd-fr7fc" event={"ID":"117c809e-76fd-458e-acbf-e2f6ce2d2f43","Type":"ContainerDied","Data":"019d30a5e2404a89eb9f84fd9ea2bf3d3d9be0599889ea6459d70cfb030b8b03"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.452342 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d56c8fbdd-fr7fc"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.457659 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d5f54fb74-68pcc" event={"ID":"be3e0d09-a01a-4f1c-9fbd-60a23a823e31","Type":"ContainerDied","Data":"6efe33eec024e9574e9af94a9bbfb0beadb939e856ba7628ea939a53809e5a5b"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.459508 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d5f54fb74-68pcc"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.461483 4948 generic.go:334] "Generic (PLEG): container finished" podID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerID="cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a" exitCode=0
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.461570 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerDied","Data":"cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.464692 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60b408db-1dec-49e0-8212-1193d4fe6a37","Type":"ContainerDied","Data":"933ef6eec024c35e2fa3dcfa2ae62aafc963a1c5d046edbe6567a3b945c72a75"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.464826 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.466502 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fce6fe82-2dcb-49cd-851a-446e66038965","Type":"ContainerDied","Data":"e2d81f333d32bce4bcb749b5b7cbf4de2ffaf7cb31712c52e8436d033d22cddd"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.467674 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.468297 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3","Type":"ContainerDied","Data":"465e42b51728405a405c62806bb707352238117837983114b872ee8862b54dfe"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.468378 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.480162 4948 generic.go:334] "Generic (PLEG): container finished" podID="214441b7-69b1-4518-a135-73de11d39a1d" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" exitCode=0
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.481204 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.481218 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"214441b7-69b1-4518-a135-73de11d39a1d","Type":"ContainerDied","Data":"1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.482195 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"214441b7-69b1-4518-a135-73de11d39a1d","Type":"ContainerDied","Data":"217ca86a7b9acc6c236f585c2eec3413fa21473be47f3842fd32b9477abf3d12"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.487810 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0a2da-account-delete-2tst9" event={"ID":"e1b64e38-8be0-41af-bf89-878d17bbd7a5","Type":"ContainerStarted","Data":"336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.491320 4948 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell0a2da-account-delete-2tst9" secret="" err="secret \"galera-openstack-dockercfg-9d9sk\" not found"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.495332 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.497409 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1e276-account-delete-5xw8n" event={"ID":"4d563e18-b478-40af-b4c6-b2dd89ea863a","Type":"ContainerStarted","Data":"70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.497581 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell1e276-account-delete-5xw8n" podUID="4d563e18-b478-40af-b4c6-b2dd89ea863a" containerName="mariadb-account-delete" containerID="cri-o://70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3" gracePeriod=30
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.499350 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b408db-1dec-49e0-8212-1193d4fe6a37" (UID: "60b408db-1dec-49e0-8212-1193d4fe6a37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.499760 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle\") pod \"60b408db-1dec-49e0-8212-1193d4fe6a37\" (UID: \"60b408db-1dec-49e0-8212-1193d4fe6a37\") "
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.501171 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.501197 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: W1204 18:00:38.501778 4948 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/60b408db-1dec-49e0-8212-1193d4fe6a37/volumes/kubernetes.io~secret/combined-ca-bundle
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.501824 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b408db-1dec-49e0-8212-1193d4fe6a37" (UID: "60b408db-1dec-49e0-8212-1193d4fe6a37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.515787 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" (UID: "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.515998 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" (UID: "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.523110 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.523826 4948 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement7046-account-delete-d78kq" secret="" err="secret \"galera-openstack-dockercfg-9d9sk\" not found"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.524304 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7046-account-delete-d78kq" event={"ID":"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591","Type":"ContainerStarted","Data":"3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2"}
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.524562 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.524803 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.541164 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.548021 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement7046-account-delete-d78kq" podStartSLOduration=7.548002719 podStartE2EDuration="7.548002719s" podCreationTimestamp="2025-12-04 18:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 18:00:38.540526163 +0000 UTC m=+2049.901600565" watchObservedRunningTime="2025-12-04 18:00:38.548002719 +0000 UTC m=+2049.909077121"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.559121 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell1e276-account-delete-5xw8n" podStartSLOduration=7.559034379 podStartE2EDuration="7.559034379s" podCreationTimestamp="2025-12-04 18:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 18:00:38.556188604 +0000 UTC m=+2049.917263006" watchObservedRunningTime="2025-12-04 18:00:38.559034379 +0000 UTC m=+2049.920108781"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.559872 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.586135 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.588566 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0a2da-account-delete-2tst9" podStartSLOduration=7.588548312 podStartE2EDuration="7.588548312s" podCreationTimestamp="2025-12-04 18:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 18:00:38.576307612 +0000 UTC m=+2049.937382014" watchObservedRunningTime="2025-12-04 18:00:38.588548312 +0000 UTC m=+2049.949622714"
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.596876 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" (UID: "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.602932 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.602971 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.602984 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.602995 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.603006 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.603018 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.603030 4948 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.603121 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.603178 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:39.103160195 +0000 UTC m=+2050.464234597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found
Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.603794 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.603836 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:39.103826103 +0000 UTC m=+2050.464900515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.634361 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data" (OuterVolumeSpecName: "config-data") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.639128 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-config-data" (OuterVolumeSpecName: "config-data") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.646476 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fce6fe82-2dcb-49cd-851a-446e66038965" (UID: "fce6fe82-2dcb-49cd-851a-446e66038965"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.672463 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "214441b7-69b1-4518-a135-73de11d39a1d" (UID: "214441b7-69b1-4518-a135-73de11d39a1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.704180 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "fce6fe82-2dcb-49cd-851a-446e66038965" (UID: "fce6fe82-2dcb-49cd-851a-446e66038965"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.704785 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs\") pod \"fce6fe82-2dcb-49cd-851a-446e66038965\" (UID: \"fce6fe82-2dcb-49cd-851a-446e66038965\") "
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.705167 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.705185 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214441b7-69b1-4518-a135-73de11d39a1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.705197 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.705207 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.705261 4948
configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.705303 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data podName:b34ca165-31d6-44fa-b175-ed2b1bf9f766 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:46.705290083 +0000 UTC m=+2058.066364485 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data") pod "rabbitmq-cell1-server-0" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766") : configmap "rabbitmq-cell1-config-data" not found Dec 04 18:00:38 crc kubenswrapper[4948]: W1204 18:00:38.705627 4948 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fce6fe82-2dcb-49cd-851a-446e66038965/volumes/kubernetes.io~secret/memcached-tls-certs Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.705642 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "fce6fe82-2dcb-49cd-851a-446e66038965" (UID: "fce6fe82-2dcb-49cd-851a-446e66038965"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.710478 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" (UID: "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.718223 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-config-data" (OuterVolumeSpecName: "config-data") pod "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" (UID: "bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.719408 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-config-data" (OuterVolumeSpecName: "config-data") pod "60b408db-1dec-49e0-8212-1193d4fe6a37" (UID: "60b408db-1dec-49e0-8212-1193d4fe6a37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.724517 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-config-data" (OuterVolumeSpecName: "config-data") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.729050 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "60b408db-1dec-49e0-8212-1193d4fe6a37" (UID: "60b408db-1dec-49e0-8212-1193d4fe6a37"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.739984 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.752437 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.755651 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data" (OuterVolumeSpecName: "config-data") pod "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" (UID: "dfdde2fd-5c98-4b6f-b9a5-a746a454fafd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.761289 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: "117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.797315 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: "117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.800619 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.802203 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.802919 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.803182 4948 
prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806525 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806559 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806591 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806605 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806630 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:40.806610199 +0000 UTC m=+2052.167684601 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-scripts" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806654 4948 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806687 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-scripts: configmap "ovnnorthd-scripts" not found Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806695 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806711 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806736 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:40.806713102 +0000 UTC m=+2052.167787504 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kolla-config" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-config-data" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806664 4948 configmap.go:193] Couldn't get configMap openstack/openstack-config-data: configmap "openstack-config-data" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806748 4948 configmap.go:193] Couldn't get configMap openstack/ovnnorthd-config: configmap "ovnnorthd-config" not found Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806759 4948 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/fce6fe82-2dcb-49cd-851a-446e66038965-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806777 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:46.806767373 +0000 UTC m=+2058.167841775 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-scripts" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806791 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config podName:b6b365e8-6c2a-41fe-b50a-1702144d67d4 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:46.806785163 +0000 UTC m=+2058.167859565 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config") pod "ovn-northd-0" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4") : configmap "ovnnorthd-config" not found Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.806804 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default podName:27244fac-7ff8-4ca0-9002-ef85f78a2564 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:40.806797944 +0000 UTC m=+2052.167872336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-default" (UniqueName: "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default") pod "openstack-galera-0" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564") : configmap "openstack-config-data" not found Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.806802 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.807325 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.807340 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.807395 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc 
kubenswrapper[4948]: I1204 18:00:38.807409 4948 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b408db-1dec-49e0-8212-1193d4fe6a37-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.817541 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.817828 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.836989 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.837183 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.840673 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.840733 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be3e0d09-a01a-4f1c-9fbd-60a23a823e31" (UID: "be3e0d09-a01a-4f1c-9fbd-60a23a823e31"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.840748 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="ovn-northd" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.841200 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c08574c-af0f-4e7c-81af-b180b29ce4ee" (UID: "0c08574c-af0f-4e7c-81af-b180b29ce4ee"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.841538 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:38 crc kubenswrapper[4948]: E1204 18:00:38.841582 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.844617 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c881bee3-e2f3-4da4-a12f-00db430e4323" (UID: "c881bee3-e2f3-4da4-a12f-00db430e4323"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.845367 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" (UID: "4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.884508 4948 scope.go:117] "RemoveContainer" containerID="228923eb8a21101983b2bce76096ee36269db4118b9d4b8fa58a9ef47c3110a3" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.912899 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.916726 4948 scope.go:117] "RemoveContainer" containerID="648062ad89f6bf56de82ee3bd52951b86df3e900c322ee6b0c57f21f40ba73d8" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.917920 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c881bee3-e2f3-4da4-a12f-00db430e4323-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.918762 4948 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.918889 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be3e0d09-a01a-4f1c-9fbd-60a23a823e31-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.918958 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c08574c-af0f-4e7c-81af-b180b29ce4ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.921543 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "117c809e-76fd-458e-acbf-e2f6ce2d2f43" (UID: 
"117c809e-76fd-458e-acbf-e2f6ce2d2f43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.939864 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0913a9-d96e-404a-9ece-85dc07caad20" path="/var/lib/kubelet/pods/8d0913a9-d96e-404a-9ece-85dc07caad20/volumes" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.941018 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad839bcb-16b3-4321-8cf7-5e698ea7b32d" path="/var/lib/kubelet/pods/ad839bcb-16b3-4321-8cf7-5e698ea7b32d/volumes" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.941870 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee248375-d52b-46cc-bef6-c6a53f95537e" path="/var/lib/kubelet/pods/ee248375-d52b-46cc-bef6-c6a53f95537e/volumes" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.942694 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a" path="/var/lib/kubelet/pods/f2ddf4ea-25cd-4ee7-90db-b829ba11ce6a/volumes" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.945774 4948 scope.go:117] "RemoveContainer" containerID="6b974c2237fa21ad81c9fc52a94fd34f294e0f3775cb43b085b2410092be36d4" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.985212 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.985315 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.985648 4948 scope.go:117] "RemoveContainer" containerID="b6f89e0991c69b6cee27c2fa1ab82523519a1bb22b60f0d9a6b4e7f17f7f22bc" Dec 04 18:00:38 crc kubenswrapper[4948]: I1204 18:00:38.989130 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanaee8-account-delete-r5gkz" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.018833 4948 scope.go:117] "RemoveContainer" containerID="f3c7b7339517046484e3d5e33d506a76290c1be3ff41874ab17e7a9348fa892a" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.019839 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzf26\" (UniqueName: \"kubernetes.io/projected/9acee6d3-23af-4793-8e56-8f3fbc169779-kube-api-access-jzf26\") pod \"9acee6d3-23af-4793-8e56-8f3fbc169779\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.022414 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9acee6d3-23af-4793-8e56-8f3fbc169779-operator-scripts\") pod \"9acee6d3-23af-4793-8e56-8f3fbc169779\" (UID: \"9acee6d3-23af-4793-8e56-8f3fbc169779\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.022928 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9acee6d3-23af-4793-8e56-8f3fbc169779-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9acee6d3-23af-4793-8e56-8f3fbc169779" (UID: "9acee6d3-23af-4793-8e56-8f3fbc169779"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.024049 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c809e-76fd-458e-acbf-e2f6ce2d2f43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.024074 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9acee6d3-23af-4793-8e56-8f3fbc169779-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.027218 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.038012 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.050643 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.052213 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9acee6d3-23af-4793-8e56-8f3fbc169779-kube-api-access-jzf26" (OuterVolumeSpecName: "kube-api-access-jzf26") pod "9acee6d3-23af-4793-8e56-8f3fbc169779" (UID: "9acee6d3-23af-4793-8e56-8f3fbc169779"). InnerVolumeSpecName "kube-api-access-jzf26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.057245 4948 scope.go:117] "RemoveContainer" containerID="87d492aabd3820482e4066aa4ce2c353d0f6250c208d12c75d4f38c935248ea8" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.059432 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.071192 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.074525 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.201:3000/\": dial tcp 10.217.0.201:3000: connect: connection refused" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.078562 4948 scope.go:117] "RemoveContainer" containerID="ca9cbacbe11dab0d67e4befee314eff5c109090f17e6413c4b39e087ea4c3f46" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.080135 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.091184 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.099171 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.106875 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.126119 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzf26\" (UniqueName: \"kubernetes.io/projected/9acee6d3-23af-4793-8e56-8f3fbc169779-kube-api-access-jzf26\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc 
kubenswrapper[4948]: E1204 18:00:39.126284 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.126462 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:40.126443403 +0000 UTC m=+2051.487517805 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.126859 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.126901 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:40.126888975 +0000 UTC m=+2051.487963377 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.127940 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.140329 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.146573 4948 scope.go:117] "RemoveContainer" containerID="12e734767396eb518b40a349afac5356ca256b1870d63b32a4acf4a594db67b0" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.151161 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d56c8fbdd-fr7fc"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.159104 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d56c8fbdd-fr7fc"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.166737 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d5f54fb74-68pcc"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.174178 4948 scope.go:117] "RemoveContainer" containerID="7ac6688f04690776901f6c203c84667f87067c95bbc8e52dbe5b9d9106e8a071" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.174397 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d5f54fb74-68pcc"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.180864 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.211169 4948 scope.go:117] "RemoveContainer" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 
18:00:39.217296 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.229293 4948 scope.go:117] "RemoveContainer" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.233902 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f\": container with ID starting with 1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f not found: ID does not exist" containerID="1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.233953 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f"} err="failed to get container status \"1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f\": rpc error: code = NotFound desc = could not find container \"1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f\": container with ID starting with 1a522408b3ab57ef7337f61de67168b2ed8882d71354b5e190a20eb10140206f not found: ID does not exist" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.278849 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1e276-account-delete-5xw8n" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.280507 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance47ce-account-delete-9mwp2" Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.304447 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.305920 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.322271 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.322362 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerName="nova-scheduler-scheduler" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.331848 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d563e18-b478-40af-b4c6-b2dd89ea863a-operator-scripts\") pod \"4d563e18-b478-40af-b4c6-b2dd89ea863a\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " Dec 04 
18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.331980 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr9dp\" (UniqueName: \"kubernetes.io/projected/80f9ff11-f145-4e76-a9fc-084de8ccb029-kube-api-access-rr9dp\") pod \"80f9ff11-f145-4e76-a9fc-084de8ccb029\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.332052 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80f9ff11-f145-4e76-a9fc-084de8ccb029-operator-scripts\") pod \"80f9ff11-f145-4e76-a9fc-084de8ccb029\" (UID: \"80f9ff11-f145-4e76-a9fc-084de8ccb029\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.332243 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtr9\" (UniqueName: \"kubernetes.io/projected/4d563e18-b478-40af-b4c6-b2dd89ea863a-kube-api-access-4wtr9\") pod \"4d563e18-b478-40af-b4c6-b2dd89ea863a\" (UID: \"4d563e18-b478-40af-b4c6-b2dd89ea863a\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.346642 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80f9ff11-f145-4e76-a9fc-084de8ccb029-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80f9ff11-f145-4e76-a9fc-084de8ccb029" (UID: "80f9ff11-f145-4e76-a9fc-084de8ccb029"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.348107 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d563e18-b478-40af-b4c6-b2dd89ea863a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d563e18-b478-40af-b4c6-b2dd89ea863a" (UID: "4d563e18-b478-40af-b4c6-b2dd89ea863a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.353233 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder20a5-account-delete-b2bnv" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.355596 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d563e18-b478-40af-b4c6-b2dd89ea863a-kube-api-access-4wtr9" (OuterVolumeSpecName: "kube-api-access-4wtr9") pod "4d563e18-b478-40af-b4c6-b2dd89ea863a" (UID: "4d563e18-b478-40af-b4c6-b2dd89ea863a"). InnerVolumeSpecName "kube-api-access-4wtr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.357344 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f9ff11-f145-4e76-a9fc-084de8ccb029-kube-api-access-rr9dp" (OuterVolumeSpecName: "kube-api-access-rr9dp") pod "80f9ff11-f145-4e76-a9fc-084de8ccb029" (UID: "80f9ff11-f145-4e76-a9fc-084de8ccb029"). InnerVolumeSpecName "kube-api-access-rr9dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.364631 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.436335 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e5cc30-bac1-418c-af51-af5cb1d8d595-operator-scripts\") pod \"31e5cc30-bac1-418c-af51-af5cb1d8d595\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.436395 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59806891-9fa2-446a-87c1-b7efbf4b692b-operator-scripts\") pod \"59806891-9fa2-446a-87c1-b7efbf4b692b\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.436835 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpwvg\" (UniqueName: \"kubernetes.io/projected/31e5cc30-bac1-418c-af51-af5cb1d8d595-kube-api-access-vpwvg\") pod \"31e5cc30-bac1-418c-af51-af5cb1d8d595\" (UID: \"31e5cc30-bac1-418c-af51-af5cb1d8d595\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.436871 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fpmf\" (UniqueName: \"kubernetes.io/projected/59806891-9fa2-446a-87c1-b7efbf4b692b-kube-api-access-4fpmf\") pod \"59806891-9fa2-446a-87c1-b7efbf4b692b\" (UID: \"59806891-9fa2-446a-87c1-b7efbf4b692b\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.437347 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr9dp\" (UniqueName: \"kubernetes.io/projected/80f9ff11-f145-4e76-a9fc-084de8ccb029-kube-api-access-rr9dp\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.437363 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/80f9ff11-f145-4e76-a9fc-084de8ccb029-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.437373 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtr9\" (UniqueName: \"kubernetes.io/projected/4d563e18-b478-40af-b4c6-b2dd89ea863a-kube-api-access-4wtr9\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.437382 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d563e18-b478-40af-b4c6-b2dd89ea863a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.438600 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31e5cc30-bac1-418c-af51-af5cb1d8d595-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31e5cc30-bac1-418c-af51-af5cb1d8d595" (UID: "31e5cc30-bac1-418c-af51-af5cb1d8d595"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.438692 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59806891-9fa2-446a-87c1-b7efbf4b692b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59806891-9fa2-446a-87c1-b7efbf4b692b" (UID: "59806891-9fa2-446a-87c1-b7efbf4b692b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.440965 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59806891-9fa2-446a-87c1-b7efbf4b692b-kube-api-access-4fpmf" (OuterVolumeSpecName: "kube-api-access-4fpmf") pod "59806891-9fa2-446a-87c1-b7efbf4b692b" (UID: "59806891-9fa2-446a-87c1-b7efbf4b692b"). InnerVolumeSpecName "kube-api-access-4fpmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.441854 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e5cc30-bac1-418c-af51-af5cb1d8d595-kube-api-access-vpwvg" (OuterVolumeSpecName: "kube-api-access-vpwvg") pod "31e5cc30-bac1-418c-af51-af5cb1d8d595" (UID: "31e5cc30-bac1-418c-af51-af5cb1d8d595"). InnerVolumeSpecName "kube-api-access-vpwvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.535503 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapide93-account-delete-s9wkh" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.539104 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31e5cc30-bac1-418c-af51-af5cb1d8d595-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.539133 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59806891-9fa2-446a-87c1-b7efbf4b692b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.539142 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpwvg\" (UniqueName: \"kubernetes.io/projected/31e5cc30-bac1-418c-af51-af5cb1d8d595-kube-api-access-vpwvg\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.539151 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fpmf\" (UniqueName: \"kubernetes.io/projected/59806891-9fa2-446a-87c1-b7efbf4b692b-kube-api-access-4fpmf\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.546540 4948 generic.go:334] "Generic (PLEG): container finished" podID="4d563e18-b478-40af-b4c6-b2dd89ea863a" 
containerID="70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3" exitCode=1 Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.546627 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1e276-account-delete-5xw8n" event={"ID":"4d563e18-b478-40af-b4c6-b2dd89ea863a","Type":"ContainerDied","Data":"70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.546637 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1e276-account-delete-5xw8n" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.546666 4948 scope.go:117] "RemoveContainer" containerID="70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.546652 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1e276-account-delete-5xw8n" event={"ID":"4d563e18-b478-40af-b4c6-b2dd89ea863a","Type":"ContainerDied","Data":"5aaf8d1d51e73c6ee336638463fec279588906f1fe784b96badda2d3faa37666"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.554416 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder20a5-account-delete-b2bnv" event={"ID":"31e5cc30-bac1-418c-af51-af5cb1d8d595","Type":"ContainerDied","Data":"d311eed4082742e9e355378e89115c1568f4c2513b74e4ce4b82dfb716efc259"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.554631 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d311eed4082742e9e355378e89115c1568f4c2513b74e4ce4b82dfb716efc259" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.554662 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder20a5-account-delete-b2bnv" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.559794 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanaee8-account-delete-r5gkz" event={"ID":"9acee6d3-23af-4793-8e56-8f3fbc169779","Type":"ContainerDied","Data":"3ba5a2d6c81678c189f381dbb956b8fca5e299a07868c418916fb3e924c06e2a"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.559834 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ba5a2d6c81678c189f381dbb956b8fca5e299a07868c418916fb3e924c06e2a" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.559914 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanaee8-account-delete-r5gkz" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.574463 4948 generic.go:334] "Generic (PLEG): container finished" podID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerID="de019385e7338481198dc33686e0126bb41672f2effc6fd4c866ef06770f14f7" exitCode=0 Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.574552 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b34ca165-31d6-44fa-b175-ed2b1bf9f766","Type":"ContainerDied","Data":"de019385e7338481198dc33686e0126bb41672f2effc6fd4c866ef06770f14f7"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.586631 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4bb1-account-delete-4fsjg" event={"ID":"59806891-9fa2-446a-87c1-b7efbf4b692b","Type":"ContainerDied","Data":"403bba4c9b8c6be6f89387f93ad9236f22b8236dff6af9a494652ac5021e7856"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.586667 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403bba4c9b8c6be6f89387f93ad9236f22b8236dff6af9a494652ac5021e7856" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.586722 4948 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron4bb1-account-delete-4fsjg" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.590309 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapide93-account-delete-s9wkh" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.590300 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapide93-account-delete-s9wkh" event={"ID":"48d8f605-3274-40ec-8a30-8dc188fdcd86","Type":"ContainerDied","Data":"4311ac794bbba6c28715202d1fa2559048c4ea66aa409b3ff1f8b56f625021bb"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.590486 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4311ac794bbba6c28715202d1fa2559048c4ea66aa409b3ff1f8b56f625021bb" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.593665 4948 scope.go:117] "RemoveContainer" containerID="70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3" Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.595530 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3\": container with ID starting with 70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3 not found: ID does not exist" containerID="70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.595592 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3"} err="failed to get container status \"70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3\": rpc error: code = NotFound desc = could not find container \"70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3\": container with ID starting with 
70efdf0fe30fade538e4f16d32727a36eab6fbf0d903f89e006c52b17b050cd3 not found: ID does not exist" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.606015 4948 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement7046-account-delete-d78kq" secret="" err="secret \"galera-openstack-dockercfg-9d9sk\" not found" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.606590 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance47ce-account-delete-9mwp2" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.606684 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance47ce-account-delete-9mwp2" event={"ID":"80f9ff11-f145-4e76-a9fc-084de8ccb029","Type":"ContainerDied","Data":"979c7fd0ffb2f20249d54d4e390608046276ce5bb279de1d81fdd19a936f1f77"} Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.606754 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="979c7fd0ffb2f20249d54d4e390608046276ce5bb279de1d81fdd19a936f1f77" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.607022 4948 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell0a2da-account-delete-2tst9" secret="" err="secret \"galera-openstack-dockercfg-9d9sk\" not found" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.640655 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl2gh\" (UniqueName: \"kubernetes.io/projected/48d8f605-3274-40ec-8a30-8dc188fdcd86-kube-api-access-cl2gh\") pod \"48d8f605-3274-40ec-8a30-8dc188fdcd86\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.640739 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48d8f605-3274-40ec-8a30-8dc188fdcd86-operator-scripts\") pod \"48d8f605-3274-40ec-8a30-8dc188fdcd86\" (UID: \"48d8f605-3274-40ec-8a30-8dc188fdcd86\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.642658 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48d8f605-3274-40ec-8a30-8dc188fdcd86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48d8f605-3274-40ec-8a30-8dc188fdcd86" (UID: "48d8f605-3274-40ec-8a30-8dc188fdcd86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.647180 4948 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 18:00:39 crc kubenswrapper[4948]: E1204 18:00:39.647249 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data podName:90b4baf7-8366-4f47-8515-c33e1b691856 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:47.647230776 +0000 UTC m=+2059.008305258 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data") pod "rabbitmq-server-0" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856") : configmap "rabbitmq-config-data" not found Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.655154 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48d8f605-3274-40ec-8a30-8dc188fdcd86-kube-api-access-cl2gh" (OuterVolumeSpecName: "kube-api-access-cl2gh") pod "48d8f605-3274-40ec-8a30-8dc188fdcd86" (UID: "48d8f605-3274-40ec-8a30-8dc188fdcd86"). InnerVolumeSpecName "kube-api-access-cl2gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.661551 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1e276-account-delete-5xw8n"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.677152 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1e276-account-delete-5xw8n"] Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.730667 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.743582 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl2gh\" (UniqueName: \"kubernetes.io/projected/48d8f605-3274-40ec-8a30-8dc188fdcd86-kube-api-access-cl2gh\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.743607 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48d8f605-3274-40ec-8a30-8dc188fdcd86-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.825807 4948 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946612 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-tls\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946648 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946671 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-server-conf\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946699 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946755 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b34ca165-31d6-44fa-b175-ed2b1bf9f766-pod-info\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946806 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-erlang-cookie\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946846 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-confd\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946877 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b34ca165-31d6-44fa-b175-ed2b1bf9f766-erlang-cookie-secret\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946897 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcf5l\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-kube-api-access-hcf5l\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946950 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-plugins\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.946971 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-plugins-conf\") pod \"b34ca165-31d6-44fa-b175-ed2b1bf9f766\" (UID: 
\"b34ca165-31d6-44fa-b175-ed2b1bf9f766\") " Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.948469 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.948982 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.949612 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.949894 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.950624 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-kube-api-access-hcf5l" (OuterVolumeSpecName: "kube-api-access-hcf5l") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "kube-api-access-hcf5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.953803 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b34ca165-31d6-44fa-b175-ed2b1bf9f766-pod-info" (OuterVolumeSpecName: "pod-info") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.954024 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.954176 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b34ca165-31d6-44fa-b175-ed2b1bf9f766-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:39 crc kubenswrapper[4948]: I1204 18:00:39.974513 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data" (OuterVolumeSpecName: "config-data") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.005356 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-server-conf" (OuterVolumeSpecName: "server-conf") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048381 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048413 4948 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b34ca165-31d6-44fa-b175-ed2b1bf9f766-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048423 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcf5l\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-kube-api-access-hcf5l\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048432 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048460 4948 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048469 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048498 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048506 4948 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048532 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34ca165-31d6-44fa-b175-ed2b1bf9f766-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.048542 4948 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b34ca165-31d6-44fa-b175-ed2b1bf9f766-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.087722 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.120293 4948 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.149572 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-combined-ca-bundle\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.149956 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.150100 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs7sv\" (UniqueName: \"kubernetes.io/projected/27244fac-7ff8-4ca0-9002-ef85f78a2564-kube-api-access-gs7sv\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.150604 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.150707 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-generated\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.150804 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.150897 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-galera-tls-certs\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.150983 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default\") pod \"27244fac-7ff8-4ca0-9002-ef85f78a2564\" (UID: \"27244fac-7ff8-4ca0-9002-ef85f78a2564\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.151459 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.151443 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.151571 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). 
InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.151794 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.151923 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.151993 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:42.151913476 +0000 UTC m=+2053.512987878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.152144 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:42.152130992 +0000 UTC m=+2053.513205394 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.152161 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.152082 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.155431 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27244fac-7ff8-4ca0-9002-ef85f78a2564-kube-api-access-gs7sv" (OuterVolumeSpecName: "kube-api-access-gs7sv") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "kube-api-access-gs7sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.155553 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b34ca165-31d6-44fa-b175-ed2b1bf9f766" (UID: "b34ca165-31d6-44fa-b175-ed2b1bf9f766"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.171322 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.175002 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b6b365e8-6c2a-41fe-b50a-1702144d67d4/ovn-northd/0.log" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.175085 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.181745 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.203107 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "27244fac-7ff8-4ca0-9002-ef85f78a2564" (UID: "27244fac-7ff8-4ca0-9002-ef85f78a2564"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.252630 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253155 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-northd-tls-certs\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253201 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-rundir\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253279 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-combined-ca-bundle\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253355 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-k8sn7\" (UniqueName: \"kubernetes.io/projected/b6b365e8-6c2a-41fe-b50a-1702144d67d4-kube-api-access-k8sn7\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253402 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253433 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-metrics-certs-tls-certs\") pod \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\" (UID: \"b6b365e8-6c2a-41fe-b50a-1702144d67d4\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.253688 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts" (OuterVolumeSpecName: "scripts") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254172 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config" (OuterVolumeSpecName: "config") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254409 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254450 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254463 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs7sv\" (UniqueName: \"kubernetes.io/projected/27244fac-7ff8-4ca0-9002-ef85f78a2564-kube-api-access-gs7sv\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254478 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254489 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254500 4948 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254513 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254525 
4948 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27244fac-7ff8-4ca0-9002-ef85f78a2564-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254539 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6b365e8-6c2a-41fe-b50a-1702144d67d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254549 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27244fac-7ff8-4ca0-9002-ef85f78a2564-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254561 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b34ca165-31d6-44fa-b175-ed2b1bf9f766-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.254699 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.277433 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b365e8-6c2a-41fe-b50a-1702144d67d4-kube-api-access-k8sn7" (OuterVolumeSpecName: "kube-api-access-k8sn7") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "kube-api-access-k8sn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.287886 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.297986 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.350168 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.357011 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.357194 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8sn7\" (UniqueName: \"kubernetes.io/projected/b6b365e8-6c2a-41fe-b50a-1702144d67d4-kube-api-access-k8sn7\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.357216 4948 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.357271 4948 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6b365e8-6c2a-41fe-b50a-1702144d67d4-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.357289 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.373463 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b6b365e8-6c2a-41fe-b50a-1702144d67d4" (UID: "b6b365e8-6c2a-41fe-b50a-1702144d67d4"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.459437 4948 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b365e8-6c2a-41fe-b50a-1702144d67d4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.629391 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.629452 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.631514 4948 generic.go:334] "Generic (PLEG): container finished" podID="90b4baf7-8366-4f47-8515-c33e1b691856" containerID="ce3cf731c06ee83c40bae89c0c8e62893dd7be16f5ea71cde48d876fb17f3f41" exitCode=0 Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.631610 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"90b4baf7-8366-4f47-8515-c33e1b691856","Type":"ContainerDied","Data":"ce3cf731c06ee83c40bae89c0c8e62893dd7be16f5ea71cde48d876fb17f3f41"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.631635 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"90b4baf7-8366-4f47-8515-c33e1b691856","Type":"ContainerDied","Data":"7a8effce90210822af5a088861f11228c36d6225a004f984f8933d2fc185cf2d"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 
18:00:40.631645 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a8effce90210822af5a088861f11228c36d6225a004f984f8933d2fc185cf2d" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.644638 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b34ca165-31d6-44fa-b175-ed2b1bf9f766","Type":"ContainerDied","Data":"620df2ab73d787ca0a2318f901ef4e5e794bd6131973c0c9fb9775dc461704e7"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.644698 4948 scope.go:117] "RemoveContainer" containerID="de019385e7338481198dc33686e0126bb41672f2effc6fd4c866ef06770f14f7" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.644862 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.663598 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.664000 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b6b365e8-6c2a-41fe-b50a-1702144d67d4/ovn-northd/0.log" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.664167 4948 generic.go:334] "Generic (PLEG): container finished" podID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" exitCode=139 Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.664258 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.664347 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b6b365e8-6c2a-41fe-b50a-1702144d67d4","Type":"ContainerDied","Data":"60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.664428 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b6b365e8-6c2a-41fe-b50a-1702144d67d4","Type":"ContainerDied","Data":"de06dd9e6d7c027a8d179be989286ee149c6150e569ba7775bf42ec93e14bab0"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.671931 4948 generic.go:334] "Generic (PLEG): container finished" podID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerID="6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163" exitCode=0 Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.672011 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27244fac-7ff8-4ca0-9002-ef85f78a2564","Type":"ContainerDied","Data":"6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.672167 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27244fac-7ff8-4ca0-9002-ef85f78a2564","Type":"ContainerDied","Data":"a8c0af5aae132e2544ef791c098e1d413a5a8ae119ff721087d8a2f7969a5c88"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.672284 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.702917 4948 generic.go:334] "Generic (PLEG): container finished" podID="fb168081-824d-45ef-a815-b96d44b58b7c" containerID="42f8d6eb61951e718daf3a1c3876ae5812998b0025aa93db9e52773f8f045f77" exitCode=0 Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.702958 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54df7858f8-fz456" event={"ID":"fb168081-824d-45ef-a815-b96d44b58b7c","Type":"ContainerDied","Data":"42f8d6eb61951e718daf3a1c3876ae5812998b0025aa93db9e52773f8f045f77"} Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.707139 4948 scope.go:117] "RemoveContainer" containerID="0faca2eff6b0bcf6f0f9c1e986baf52aab23458cefa4976735633696f679414d" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.732264 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.752181 4948 scope.go:117] "RemoveContainer" containerID="27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.755329 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.764505 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b4baf7-8366-4f47-8515-c33e1b691856-pod-info\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.764936 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc 
kubenswrapper[4948]: I1204 18:00:40.764963 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-tls\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765058 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-confd\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765087 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765113 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hm6h\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-kube-api-access-4hm6h\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765165 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-plugins-conf\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765199 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-plugins\") pod 
\"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765233 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-server-conf\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765254 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b4baf7-8366-4f47-8515-c33e1b691856-erlang-cookie-secret\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765279 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-erlang-cookie\") pod \"90b4baf7-8366-4f47-8515-c33e1b691856\" (UID: \"90b4baf7-8366-4f47-8515-c33e1b691856\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.765780 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.766172 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.768494 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.772871 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.775095 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/90b4baf7-8366-4f47-8515-c33e1b691856-pod-info" (OuterVolumeSpecName: "pod-info") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.777341 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.791233 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.797799 4948 scope.go:117] "RemoveContainer" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.802299 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.811409 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-kube-api-access-4hm6h" (OuterVolumeSpecName: "kube-api-access-4hm6h") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "kube-api-access-4hm6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.811508 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.813491 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data" (OuterVolumeSpecName: "config-data") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.819852 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.821332 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b4baf7-8366-4f47-8515-c33e1b691856-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.834197 4948 scope.go:117] "RemoveContainer" containerID="27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205" Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.839307 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205\": container with ID starting with 27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205 not found: ID does not exist" containerID="27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.839367 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205"} err="failed to get container status \"27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205\": rpc error: code = NotFound desc = could not find container \"27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205\": container with ID starting with 27c692d13273e40d1b775f969e0294df33bfb36b7bb82ba9af15bdb813042205 not found: ID does not exist" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.839403 4948 scope.go:117] "RemoveContainer" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.840024 4948 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d\": container with ID starting with 60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d not found: ID does not exist" containerID="60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.840122 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d"} err="failed to get container status \"60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d\": rpc error: code = NotFound desc = could not find container \"60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d\": container with ID starting with 60fd28e1861b92829acc56f1c40db42fa97b537338de5d98bca8fd782bed388d not found: ID does not exist" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.840138 4948 scope.go:117] "RemoveContainer" containerID="6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867074 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867109 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867135 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 
18:00:40.867148 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hm6h\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-kube-api-access-4hm6h\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867161 4948 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867172 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867183 4948 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b4baf7-8366-4f47-8515-c33e1b691856-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867195 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.867206 4948 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b4baf7-8366-4f47-8515-c33e1b691856-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.881443 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-server-conf" (OuterVolumeSpecName: "server-conf") pod "90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.882105 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-54df7858f8-fz456" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.883557 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.889128 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6kbvb"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.890585 4948 scope.go:117] "RemoveContainer" containerID="5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.902115 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6kbvb"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.912294 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4bb1-account-create-update-lzs7x"] Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.931357 4948 scope.go:117] "RemoveContainer" containerID="6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163" Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.936171 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163\": container with ID starting with 6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163 not found: ID does not exist" containerID="6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.936219 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163"} err="failed to get container status \"6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163\": rpc error: code = NotFound desc = could not find container \"6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163\": container with ID starting with 6afb1a03c3973aff582d59c9b9838a8415807c10b52642b69bc31b0390ad5163 not found: ID does not exist" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.936251 4948 scope.go:117] "RemoveContainer" containerID="5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db" Dec 04 18:00:40 crc kubenswrapper[4948]: E1204 18:00:40.939807 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db\": container with ID starting with 5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db not found: ID does not exist" containerID="5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.939841 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db"} err="failed to get container status \"5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db\": rpc error: code = NotFound desc = could not find container \"5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db\": container with ID starting with 5898ec30df462a796eaa81f3f0ce4ea184cbd70ed751f036417ddd2055db38db not found: ID does not exist" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.939981 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" path="/var/lib/kubelet/pods/0c08574c-af0f-4e7c-81af-b180b29ce4ee/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 
18:00:40.940869 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" path="/var/lib/kubelet/pods/117c809e-76fd-458e-acbf-e2f6ce2d2f43/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.942106 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145f54f7-b50a-4d77-8152-5d8986faa646" path="/var/lib/kubelet/pods/145f54f7-b50a-4d77-8152-5d8986faa646/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.942850 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214441b7-69b1-4518-a135-73de11d39a1d" path="/var/lib/kubelet/pods/214441b7-69b1-4518-a135-73de11d39a1d/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.943658 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" path="/var/lib/kubelet/pods/27244fac-7ff8-4ca0-9002-ef85f78a2564/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.945721 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" path="/var/lib/kubelet/pods/4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.946849 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d563e18-b478-40af-b4c6-b2dd89ea863a" path="/var/lib/kubelet/pods/4d563e18-b478-40af-b4c6-b2dd89ea863a/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.947473 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" path="/var/lib/kubelet/pods/60b408db-1dec-49e0-8212-1193d4fe6a37/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.948484 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod 
"90b4baf7-8366-4f47-8515-c33e1b691856" (UID: "90b4baf7-8366-4f47-8515-c33e1b691856"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.948877 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" path="/var/lib/kubelet/pods/b34ca165-31d6-44fa-b175-ed2b1bf9f766/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.949647 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" path="/var/lib/kubelet/pods/b6b365e8-6c2a-41fe-b50a-1702144d67d4/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.950917 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" path="/var/lib/kubelet/pods/bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.951688 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" path="/var/lib/kubelet/pods/be3e0d09-a01a-4f1c-9fbd-60a23a823e31/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.952605 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" path="/var/lib/kubelet/pods/c881bee3-e2f3-4da4-a12f-00db430e4323/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.954141 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" path="/var/lib/kubelet/pods/dfdde2fd-5c98-4b6f-b9a5-a746a454fafd/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.954906 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce6fe82-2dcb-49cd-851a-446e66038965" path="/var/lib/kubelet/pods/fce6fe82-2dcb-49cd-851a-446e66038965/volumes" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.967831 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-combined-ca-bundle\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.967883 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-public-tls-certs\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.967934 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-scripts\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.967976 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-credential-keys\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.967999 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-internal-tls-certs\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.968016 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqvkv\" (UniqueName: \"kubernetes.io/projected/fb168081-824d-45ef-a815-b96d44b58b7c-kube-api-access-rqvkv\") pod 
\"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.968060 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-fernet-keys\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.968090 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-config-data\") pod \"fb168081-824d-45ef-a815-b96d44b58b7c\" (UID: \"fb168081-824d-45ef-a815-b96d44b58b7c\") " Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.968481 4948 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b4baf7-8366-4f47-8515-c33e1b691856-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.968498 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.968508 4948 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b4baf7-8366-4f47-8515-c33e1b691856-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.971595 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.971627 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.974220 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb168081-824d-45ef-a815-b96d44b58b7c-kube-api-access-rqvkv" (OuterVolumeSpecName: "kube-api-access-rqvkv") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "kube-api-access-rqvkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.983340 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-scripts" (OuterVolumeSpecName: "scripts") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:40 crc kubenswrapper[4948]: I1204 18:00:40.998968 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.006805 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron4bb1-account-delete-4fsjg"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.007013 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4bb1-account-create-update-lzs7x"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.007091 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron4bb1-account-delete-4fsjg"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.019995 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-config-data" (OuterVolumeSpecName: "config-data") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.025958 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ml66n"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.032705 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ml66n"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.043096 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-47ce-account-create-update-prrl6"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.044879 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.052772 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-47ce-account-create-update-prrl6"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.058166 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb168081-824d-45ef-a815-b96d44b58b7c" (UID: "fb168081-824d-45ef-a815-b96d44b58b7c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.058228 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance47ce-account-delete-9mwp2"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.065829 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance47ce-account-delete-9mwp2"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070077 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070109 4948 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070118 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070127 4948 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070135 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070144 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqvkv\" (UniqueName: \"kubernetes.io/projected/fb168081-824d-45ef-a815-b96d44b58b7c-kube-api-access-rqvkv\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070153 4948 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.070162 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb168081-824d-45ef-a815-b96d44b58b7c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.109534 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tmvtp"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.116722 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tmvtp"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.137652 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder20a5-account-delete-b2bnv"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.146558 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-20a5-account-create-update-9x655"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.153141 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-20a5-account-create-update-9x655"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.168063 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder20a5-account-delete-b2bnv"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.228739 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-hqqtd"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.243024 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-hqqtd"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.253636 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-aee8-account-create-update-cllhj"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.263241 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanaee8-account-delete-r5gkz"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.274158 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicanaee8-account-delete-r5gkz"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.284613 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aee8-account-create-update-cllhj"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.331538 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-c8pjq"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.338837 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-c8pjq"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.346404 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement7046-account-delete-d78kq"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.346648 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement7046-account-delete-d78kq" podUID="fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" 
containerName="mariadb-account-delete" containerID="cri-o://3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2" gracePeriod=30 Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.355870 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7046-account-create-update-9zvv4"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.361568 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7046-account-create-update-9zvv4"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.420544 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475334 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-config\") pod \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475438 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-combined-ca-bundle\") pod \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475458 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-ovndb-tls-certs\") pod \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475525 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-httpd-config\") pod 
\"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475550 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-internal-tls-certs\") pod \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475602 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-public-tls-certs\") pod \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.475633 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpdfb\" (UniqueName: \"kubernetes.io/projected/0fc74dcc-f8d8-4852-913a-77cb4526eed7-kube-api-access-mpdfb\") pod \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\" (UID: \"0fc74dcc-f8d8-4852-913a-77cb4526eed7\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.481860 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc74dcc-f8d8-4852-913a-77cb4526eed7-kube-api-access-mpdfb" (OuterVolumeSpecName: "kube-api-access-mpdfb") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). InnerVolumeSpecName "kube-api-access-mpdfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.497182 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.519449 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-config" (OuterVolumeSpecName: "config") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.534033 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.534589 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.545339 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.571392 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0fc74dcc-f8d8-4852-913a-77cb4526eed7" (UID: "0fc74dcc-f8d8-4852-913a-77cb4526eed7"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.584612 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jwmbk"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.591326 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jwmbk"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598179 4948 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598208 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598218 4948 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598227 4948 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598235 4948 reconciler_common.go:293] "Volume 
detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598243 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpdfb\" (UniqueName: \"kubernetes.io/projected/0fc74dcc-f8d8-4852-913a-77cb4526eed7-kube-api-access-mpdfb\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.598253 4948 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fc74dcc-f8d8-4852-913a-77cb4526eed7-config\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.610829 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0a2da-account-delete-2tst9"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.611077 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0a2da-account-delete-2tst9" podUID="e1b64e38-8be0-41af-bf89-878d17bbd7a5" containerName="mariadb-account-delete" containerID="cri-o://336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe" gracePeriod=30 Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.645961 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a2da-account-create-update-5wtsm"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.647646 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.670127 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a2da-account-create-update-5wtsm"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.678552 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.711834 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-combined-ca-bundle\") pod \"e905edc7-cd78-48c2-9192-fb18e1d193ac\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.712103 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e905edc7-cd78-48c2-9192-fb18e1d193ac-logs\") pod \"e905edc7-cd78-48c2-9192-fb18e1d193ac\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.712544 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data\") pod \"e905edc7-cd78-48c2-9192-fb18e1d193ac\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.712840 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data-custom\") pod \"e905edc7-cd78-48c2-9192-fb18e1d193ac\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.712890 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqvbt\" (UniqueName: \"kubernetes.io/projected/e905edc7-cd78-48c2-9192-fb18e1d193ac-kube-api-access-fqvbt\") pod \"e905edc7-cd78-48c2-9192-fb18e1d193ac\" (UID: \"e905edc7-cd78-48c2-9192-fb18e1d193ac\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.714565 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e905edc7-cd78-48c2-9192-fb18e1d193ac-logs" (OuterVolumeSpecName: "logs") pod "e905edc7-cd78-48c2-9192-fb18e1d193ac" (UID: "e905edc7-cd78-48c2-9192-fb18e1d193ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.716987 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e905edc7-cd78-48c2-9192-fb18e1d193ac" (UID: "e905edc7-cd78-48c2-9192-fb18e1d193ac"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.717649 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e905edc7-cd78-48c2-9192-fb18e1d193ac-kube-api-access-fqvbt" (OuterVolumeSpecName: "kube-api-access-fqvbt") pod "e905edc7-cd78-48c2-9192-fb18e1d193ac" (UID: "e905edc7-cd78-48c2-9192-fb18e1d193ac"). InnerVolumeSpecName "kube-api-access-fqvbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.719995 4948 generic.go:334] "Generic (PLEG): container finished" podID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerID="86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0" exitCode=0 Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.720117 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7b8cbd95-z6gmw" event={"ID":"0fc74dcc-f8d8-4852-913a-77cb4526eed7","Type":"ContainerDied","Data":"86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.720147 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b7b8cbd95-z6gmw" event={"ID":"0fc74dcc-f8d8-4852-913a-77cb4526eed7","Type":"ContainerDied","Data":"2d8100129d8b34199d7f61946d7edb63140d5479620d97db65067c0b26d93c47"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.720187 4948 scope.go:117] "RemoveContainer" containerID="7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.720353 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6b7b8cbd95-z6gmw" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.739211 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.739651 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" event={"ID":"c94e22e0-c0d1-4233-b21c-9860d204c068","Type":"ContainerDied","Data":"f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.741799 4948 generic.go:334] "Generic (PLEG): container finished" podID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerID="f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186" exitCode=0 Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.741964 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dbb7d984c-hzlwz" event={"ID":"c94e22e0-c0d1-4233-b21c-9860d204c068","Type":"ContainerDied","Data":"3d559e92a3f6ede06a2024ce80699b72678702bb302c5dab59e5b0260a46a82b"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.762973 4948 scope.go:117] "RemoveContainer" containerID="86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.775324 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e905edc7-cd78-48c2-9192-fb18e1d193ac" (UID: "e905edc7-cd78-48c2-9192-fb18e1d193ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.779159 4948 generic.go:334] "Generic (PLEG): container finished" podID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" exitCode=0 Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.779395 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbda827a-8528-4b7f-8d4c-70fe8be65d27","Type":"ContainerDied","Data":"72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.782302 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.783629 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6b7b8cbd95-z6gmw"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.791033 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data" (OuterVolumeSpecName: "config-data") pod "e905edc7-cd78-48c2-9192-fb18e1d193ac" (UID: "e905edc7-cd78-48c2-9192-fb18e1d193ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.793329 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6b7b8cbd95-z6gmw"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.793580 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-54df7858f8-fz456" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.794199 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54df7858f8-fz456" event={"ID":"fb168081-824d-45ef-a815-b96d44b58b7c","Type":"ContainerDied","Data":"cfaeeb113c88db3f384ca716647858c8a39f66434ea7680d9bece6c4039a5959"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.794694 4948 scope.go:117] "RemoveContainer" containerID="7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea" Dec 04 18:00:41 crc kubenswrapper[4948]: E1204 18:00:41.795002 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea\": container with ID starting with 7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea not found: ID does not exist" containerID="7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.795055 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea"} err="failed to get container status \"7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea\": rpc error: code = NotFound desc = could not find container \"7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea\": container with ID starting with 7db6e9c0b39a915a4198f954fb2ad9004135559f52f8c7f4dff2e62899397dea not found: ID does not exist" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.795083 4948 scope.go:117] "RemoveContainer" containerID="86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0" Dec 04 18:00:41 crc kubenswrapper[4948]: E1204 18:00:41.795636 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0\": container with ID starting with 86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0 not found: ID does not exist" containerID="86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.795676 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0"} err="failed to get container status \"86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0\": rpc error: code = NotFound desc = could not find container \"86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0\": container with ID starting with 86de3021f1b8291fec1647bee2840334215f1febc06396d18734210e2f6362f0 not found: ID does not exist" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.795700 4948 scope.go:117] "RemoveContainer" containerID="f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.796685 4948 generic.go:334] "Generic (PLEG): container finished" podID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerID="ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c" exitCode=0 Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.796718 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" event={"ID":"e905edc7-cd78-48c2-9192-fb18e1d193ac","Type":"ContainerDied","Data":"ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.796750 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.796762 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.796764 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-54fb4df596-9xk9m" event={"ID":"e905edc7-cd78-48c2-9192-fb18e1d193ac","Type":"ContainerDied","Data":"d5cf4dfb906cc721346b4bc7c9c652ff344ea93e5dfdf569189b114e8ff3fcdc"} Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.814771 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxnt2\" (UniqueName: \"kubernetes.io/projected/c94e22e0-c0d1-4233-b21c-9860d204c068-kube-api-access-mxnt2\") pod \"c94e22e0-c0d1-4233-b21c-9860d204c068\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.814878 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-combined-ca-bundle\") pod \"c94e22e0-c0d1-4233-b21c-9860d204c068\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.815501 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data-custom\") pod \"c94e22e0-c0d1-4233-b21c-9860d204c068\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.815564 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94e22e0-c0d1-4233-b21c-9860d204c068-logs\") pod \"c94e22e0-c0d1-4233-b21c-9860d204c068\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.815587 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data\") pod \"c94e22e0-c0d1-4233-b21c-9860d204c068\" (UID: \"c94e22e0-c0d1-4233-b21c-9860d204c068\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.816723 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e905edc7-cd78-48c2-9192-fb18e1d193ac-logs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.816750 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.816761 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.816774 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqvbt\" (UniqueName: \"kubernetes.io/projected/e905edc7-cd78-48c2-9192-fb18e1d193ac-kube-api-access-fqvbt\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.816785 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e905edc7-cd78-48c2-9192-fb18e1d193ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.827170 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c94e22e0-c0d1-4233-b21c-9860d204c068-logs" (OuterVolumeSpecName: "logs") pod "c94e22e0-c0d1-4233-b21c-9860d204c068" (UID: "c94e22e0-c0d1-4233-b21c-9860d204c068"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.828384 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c94e22e0-c0d1-4233-b21c-9860d204c068" (UID: "c94e22e0-c0d1-4233-b21c-9860d204c068"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.834287 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94e22e0-c0d1-4233-b21c-9860d204c068-kube-api-access-mxnt2" (OuterVolumeSpecName: "kube-api-access-mxnt2") pod "c94e22e0-c0d1-4233-b21c-9860d204c068" (UID: "c94e22e0-c0d1-4233-b21c-9860d204c068"). InnerVolumeSpecName "kube-api-access-mxnt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.847928 4948 scope.go:117] "RemoveContainer" containerID="ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.855360 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c94e22e0-c0d1-4233-b21c-9860d204c068" (UID: "c94e22e0-c0d1-4233-b21c-9860d204c068"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.859905 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-snxlb"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.876440 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-snxlb"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.882995 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.883888 4948 scope.go:117] "RemoveContainer" containerID="f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186" Dec 04 18:00:41 crc kubenswrapper[4948]: E1204 18:00:41.884337 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186\": container with ID starting with f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186 not found: ID does not exist" containerID="f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.884369 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186"} err="failed to get container status \"f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186\": rpc error: code = NotFound desc = could not find container \"f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186\": container with ID starting with f34fe343e2ac3e79caf2690088b40705b59615b199cc21f1bd2c1bcdfc2ee186 not found: ID does not exist" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.884393 4948 scope.go:117] "RemoveContainer" containerID="ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b" Dec 04 18:00:41 crc kubenswrapper[4948]: E1204 
18:00:41.884699 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b\": container with ID starting with ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b not found: ID does not exist" containerID="ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.884721 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b"} err="failed to get container status \"ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b\": rpc error: code = NotFound desc = could not find container \"ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b\": container with ID starting with ff8c82e0c8281b7dbb36886ddeb72601823ac11b359bb84bae4e421363ed724b not found: ID does not exist" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.884736 4948 scope.go:117] "RemoveContainer" containerID="42f8d6eb61951e718daf3a1c3876ae5812998b0025aa93db9e52773f8f045f77" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.888305 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.889258 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data" (OuterVolumeSpecName: "config-data") pod "c94e22e0-c0d1-4233-b21c-9860d204c068" (UID: "c94e22e0-c0d1-4233-b21c-9860d204c068"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.893607 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-de93-account-create-update-d8v6q"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.900238 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapide93-account-delete-s9wkh"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.906828 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-de93-account-create-update-d8v6q"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.912359 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-54fb4df596-9xk9m"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.916203 4948 scope.go:117] "RemoveContainer" containerID="ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.917753 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-combined-ca-bundle\") pod \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.917904 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-config-data\") pod \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\" (UID: \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.917967 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwdn\" (UniqueName: \"kubernetes.io/projected/bbda827a-8528-4b7f-8d4c-70fe8be65d27-kube-api-access-nxwdn\") pod \"bbda827a-8528-4b7f-8d4c-70fe8be65d27\" (UID: 
\"bbda827a-8528-4b7f-8d4c-70fe8be65d27\") " Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.918310 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapide93-account-delete-s9wkh"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.918403 4948 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.918794 4948 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94e22e0-c0d1-4233-b21c-9860d204c068-logs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.918808 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.918817 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxnt2\" (UniqueName: \"kubernetes.io/projected/c94e22e0-c0d1-4233-b21c-9860d204c068-kube-api-access-mxnt2\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.918827 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94e22e0-c0d1-4233-b21c-9860d204c068-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.921211 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbda827a-8528-4b7f-8d4c-70fe8be65d27-kube-api-access-nxwdn" (OuterVolumeSpecName: "kube-api-access-nxwdn") pod "bbda827a-8528-4b7f-8d4c-70fe8be65d27" (UID: "bbda827a-8528-4b7f-8d4c-70fe8be65d27"). InnerVolumeSpecName "kube-api-access-nxwdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.924737 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-54fb4df596-9xk9m"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.930473 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-54df7858f8-fz456"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.935617 4948 scope.go:117] "RemoveContainer" containerID="b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.936147 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-54df7858f8-fz456"] Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.937613 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-config-data" (OuterVolumeSpecName: "config-data") pod "bbda827a-8528-4b7f-8d4c-70fe8be65d27" (UID: "bbda827a-8528-4b7f-8d4c-70fe8be65d27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.938889 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbda827a-8528-4b7f-8d4c-70fe8be65d27" (UID: "bbda827a-8528-4b7f-8d4c-70fe8be65d27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.951450 4948 scope.go:117] "RemoveContainer" containerID="ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c" Dec 04 18:00:41 crc kubenswrapper[4948]: E1204 18:00:41.952337 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c\": container with ID starting with ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c not found: ID does not exist" containerID="ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.952371 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c"} err="failed to get container status \"ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c\": rpc error: code = NotFound desc = could not find container \"ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c\": container with ID starting with ac98071512cc9335d7708ab26344b02bbcc845b1c60bdbb15b7c2ccdc4c7a68c not found: ID does not exist" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.952396 4948 scope.go:117] "RemoveContainer" containerID="b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d" Dec 04 18:00:41 crc kubenswrapper[4948]: E1204 18:00:41.952706 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d\": container with ID starting with b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d not found: ID does not exist" containerID="b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d" Dec 04 18:00:41 crc kubenswrapper[4948]: I1204 18:00:41.952734 
4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d"} err="failed to get container status \"b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d\": rpc error: code = NotFound desc = could not find container \"b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d\": container with ID starting with b75eac34bb63e70583dc8636fbdd2e9fb6981e25394b1717d3f51a76c5ecb23d not found: ID does not exist" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.020690 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.020900 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwdn\" (UniqueName: \"kubernetes.io/projected/bbda827a-8528-4b7f-8d4c-70fe8be65d27-kube-api-access-nxwdn\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.020971 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbda827a-8528-4b7f-8d4c-70fe8be65d27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.137695 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6dbb7d984c-hzlwz"] Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.143549 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6dbb7d984c-hzlwz"] Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.219693 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.164:8776/healthcheck\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.223466 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.223495 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.223537 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:46.223519103 +0000 UTC m=+2057.584593515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.223557 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:46.223548624 +0000 UTC m=+2057.584623036 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.319167 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.425266 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-run-httpd\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.425758 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.425778 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-scripts\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.426657 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-combined-ca-bundle\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.426731 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-ceilometer-tls-certs\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.426879 4948 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nqgqt\" (UniqueName: \"kubernetes.io/projected/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-kube-api-access-nqgqt\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.426954 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-sg-core-conf-yaml\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.426991 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-log-httpd\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.427022 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-config-data\") pod \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\" (UID: \"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a\") " Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.427559 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.427595 4948 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.429361 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-scripts" (OuterVolumeSpecName: "scripts") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.429569 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-kube-api-access-nqgqt" (OuterVolumeSpecName: "kube-api-access-nqgqt") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "kube-api-access-nqgqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.456105 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.465199 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.483291 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.507793 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-config-data" (OuterVolumeSpecName: "config-data") pod "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" (UID: "82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529013 4948 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529076 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqgqt\" (UniqueName: \"kubernetes.io/projected/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-kube-api-access-nqgqt\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529091 4948 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529104 4948 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529115 4948 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529126 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.529137 4948 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.615255 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.615326 4948 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": dial tcp 10.217.0.204:8775: i/o timeout" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.809818 4948 generic.go:334] "Generic (PLEG): container finished" podID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerID="62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b" exitCode=0 Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.809887 4948 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerDied","Data":"62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b"} Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.809913 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a","Type":"ContainerDied","Data":"b8bd78e1b1d8408886b743e438e9e50bf8297293c1ac257c51005e12ba3eeccb"} Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.809933 4948 scope.go:117] "RemoveContainer" containerID="f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.810034 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.823016 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbda827a-8528-4b7f-8d4c-70fe8be65d27","Type":"ContainerDied","Data":"ceb5d1f3057c688bb7eb89af8f4cba0cd723443d1a2971f88caac2f38d93f692"} Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.823091 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.839415 4948 scope.go:117] "RemoveContainer" containerID="a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.858297 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.867568 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.871609 4948 scope.go:117] "RemoveContainer" containerID="62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.873366 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.879875 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.888851 4948 scope.go:117] "RemoveContainer" containerID="cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.905437 4948 scope.go:117] "RemoveContainer" containerID="f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54" Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.905808 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54\": container with ID starting with f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54 not found: ID does not exist" containerID="f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.905849 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54"} err="failed to get container status \"f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54\": rpc error: code = NotFound desc = could not find container \"f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54\": container with ID starting with f47ff2f3c0c7b87c452a8d220aa94d62cf18f688932c1e3689f769173e7c7d54 not found: ID does not exist" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.905877 4948 scope.go:117] "RemoveContainer" containerID="a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f" Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.906380 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f\": container with ID starting with a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f not found: ID does not exist" containerID="a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.906411 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f"} err="failed to get container status \"a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f\": rpc error: code = NotFound desc = could not find container \"a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f\": container with ID starting with a3c2eb3eaa275cb2c5df12db21ee6ded0b41ca5b81306584b2f2b05453cf948f not found: ID does not exist" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.906444 4948 scope.go:117] "RemoveContainer" containerID="62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b" Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.906691 4948 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b\": container with ID starting with 62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b not found: ID does not exist" containerID="62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.906720 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b"} err="failed to get container status \"62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b\": rpc error: code = NotFound desc = could not find container \"62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b\": container with ID starting with 62c00d4b635bc5784e9e2b3d8970041643249bc295c784f3f2526d4ad6c7323b not found: ID does not exist" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.906735 4948 scope.go:117] "RemoveContainer" containerID="cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a" Dec 04 18:00:42 crc kubenswrapper[4948]: E1204 18:00:42.907023 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a\": container with ID starting with cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a not found: ID does not exist" containerID="cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.907064 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a"} err="failed to get container status \"cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a\": rpc error: code = NotFound desc = could not find container 
\"cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a\": container with ID starting with cc55b45f86a1229589578fb934421535b95cad293cba91284a2b8aa061c3f44a not found: ID does not exist" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.907080 4948 scope.go:117] "RemoveContainer" containerID="72fb0c09e77745411f3255387fd7d7d1e827291e3506dd1d09f277ed3a0e1271" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.923761 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" path="/var/lib/kubelet/pods/0fc74dcc-f8d8-4852-913a-77cb4526eed7/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.924553 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e5cc30-bac1-418c-af51-af5cb1d8d595" path="/var/lib/kubelet/pods/31e5cc30-bac1-418c-af51-af5cb1d8d595/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.925175 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c87540-53a6-4923-adcb-3af20aa678d1" path="/var/lib/kubelet/pods/42c87540-53a6-4923-adcb-3af20aa678d1/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.926229 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48d8f605-3274-40ec-8a30-8dc188fdcd86" path="/var/lib/kubelet/pods/48d8f605-3274-40ec-8a30-8dc188fdcd86/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.926685 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5917bc-97e7-4fa9-b727-c503d616e67f" path="/var/lib/kubelet/pods/4c5917bc-97e7-4fa9-b727-c503d616e67f/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.927190 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505f7a05-8fe4-4e76-b5ed-45339ebda3dc" path="/var/lib/kubelet/pods/505f7a05-8fe4-4e76-b5ed-45339ebda3dc/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.927642 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="59806891-9fa2-446a-87c1-b7efbf4b692b" path="/var/lib/kubelet/pods/59806891-9fa2-446a-87c1-b7efbf4b692b/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.928565 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63125130-8f44-4d42-8fa9-2631c2c3d8ec" path="/var/lib/kubelet/pods/63125130-8f44-4d42-8fa9-2631c2c3d8ec/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.929081 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b5b4ee-a4ce-416c-8807-e5fe61c9c59d" path="/var/lib/kubelet/pods/66b5b4ee-a4ce-416c-8807-e5fe61c9c59d/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.929580 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7744e322-879f-4483-b49e-019fc53973f5" path="/var/lib/kubelet/pods/7744e322-879f-4483-b49e-019fc53973f5/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.930483 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f9ff11-f145-4e76-a9fc-084de8ccb029" path="/var/lib/kubelet/pods/80f9ff11-f145-4e76-a9fc-084de8ccb029/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.930986 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" path="/var/lib/kubelet/pods/82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.932250 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" path="/var/lib/kubelet/pods/90b4baf7-8366-4f47-8515-c33e1b691856/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.932786 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9acee6d3-23af-4793-8e56-8f3fbc169779" path="/var/lib/kubelet/pods/9acee6d3-23af-4793-8e56-8f3fbc169779/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.933271 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="9fcef00f-3c5c-478a-a9b4-39c07f98ff69" path="/var/lib/kubelet/pods/9fcef00f-3c5c-478a-a9b4-39c07f98ff69/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.933740 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24421a3-5139-4a65-b91e-8915d1b96103" path="/var/lib/kubelet/pods/a24421a3-5139-4a65-b91e-8915d1b96103/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.934697 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b875eb-4f81-407e-b0f1-12086316a557" path="/var/lib/kubelet/pods/b6b875eb-4f81-407e-b0f1-12086316a557/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.935233 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa21dba-653c-4cec-9ecd-09a6e1dfa082" path="/var/lib/kubelet/pods/baa21dba-653c-4cec-9ecd-09a6e1dfa082/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.935689 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" path="/var/lib/kubelet/pods/bbda827a-8528-4b7f-8d4c-70fe8be65d27/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.936626 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d49c2c-6474-4667-ba8c-21c2a24e4522" path="/var/lib/kubelet/pods/c2d49c2c-6474-4667-ba8c-21c2a24e4522/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.937163 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d77e21-3036-4810-98ec-1a44a3f882df" path="/var/lib/kubelet/pods/c7d77e21-3036-4810-98ec-1a44a3f882df/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.937658 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" path="/var/lib/kubelet/pods/c94e22e0-c0d1-4233-b21c-9860d204c068/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.938737 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e083d908-b647-4875-8ae1-d455db250897" path="/var/lib/kubelet/pods/e083d908-b647-4875-8ae1-d455db250897/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.939236 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" path="/var/lib/kubelet/pods/e905edc7-cd78-48c2-9192-fb18e1d193ac/volumes" Dec 04 18:00:42 crc kubenswrapper[4948]: I1204 18:00:42.939956 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb168081-824d-45ef-a815-b96d44b58b7c" path="/var/lib/kubelet/pods/fb168081-824d-45ef-a815-b96d44b58b7c/volumes" Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.789265 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.790644 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.790905 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 
18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.791363 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.791405 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.792795 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.794886 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:43 crc kubenswrapper[4948]: E1204 18:00:43.795271 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" Dec 04 18:00:46 crc kubenswrapper[4948]: E1204 18:00:46.294655 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:46 crc kubenswrapper[4948]: E1204 18:00:46.294966 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:54.294942246 +0000 UTC m=+2065.656016718 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found Dec 04 18:00:46 crc kubenswrapper[4948]: E1204 18:00:46.294792 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:46 crc kubenswrapper[4948]: E1204 18:00:46.295015 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:00:54.295008257 +0000 UTC m=+2065.656082659 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.789589 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.791161 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.791550 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.792169 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" 
containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.792230 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.792795 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.794618 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:48 crc kubenswrapper[4948]: E1204 18:00:48.794898 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.790187 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.791713 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.792298 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.792367 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.792353 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.796838 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.798947 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:53 crc kubenswrapper[4948]: E1204 18:00:53.799141 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" Dec 04 18:00:54 crc kubenswrapper[4948]: E1204 18:00:54.317302 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:54 crc kubenswrapper[4948]: E1204 18:00:54.317403 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:01:10.317380844 +0000 UTC m=+2081.678455326 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found Dec 04 18:00:54 crc kubenswrapper[4948]: E1204 18:00:54.317306 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:00:54 crc kubenswrapper[4948]: E1204 18:00:54.317506 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:01:10.317484167 +0000 UTC m=+2081.678558639 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.789238 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.790697 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.791366 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.791929 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.792104 4948 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.792393 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.794074 4948 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 18:00:58 crc kubenswrapper[4948]: E1204 18:00:58.794127 4948 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-rzjh8" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.515990 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rzjh8_1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d/ovs-vswitchd/0.log" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.517012 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.647952 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-lib\") pod \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648013 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-run\") pod \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648109 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmjx2\" (UniqueName: 
\"kubernetes.io/projected/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-kube-api-access-pmjx2\") pod \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648127 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-scripts\") pod \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648152 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-etc-ovs\") pod \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648166 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-log\") pod \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\" (UID: \"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d\") " Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648428 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" (UID: "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648448 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-run" (OuterVolumeSpecName: "var-run") pod "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" (UID: "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648491 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-lib" (OuterVolumeSpecName: "var-lib") pod "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" (UID: "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648495 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-log" (OuterVolumeSpecName: "var-log") pod "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" (UID: "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648776 4948 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648794 4948 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648803 4948 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.648813 4948 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-var-lib\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.649769 4948 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-scripts" (OuterVolumeSpecName: "scripts") pod "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" (UID: "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.654273 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-kube-api-access-pmjx2" (OuterVolumeSpecName: "kube-api-access-pmjx2") pod "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" (UID: "1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d"). InnerVolumeSpecName "kube-api-access-pmjx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.750301 4948 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.750323 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmjx2\" (UniqueName: \"kubernetes.io/projected/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d-kube-api-access-pmjx2\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:02 crc kubenswrapper[4948]: I1204 18:01:02.929457 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.054723 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") pod \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.054901 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.054942 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-cache\") pod \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.055034 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-567f8\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-kube-api-access-567f8\") pod \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.055150 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-lock\") pod \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\" (UID: \"6bc62dd5-67bd-4e26-bedb-58e1d56abac9\") " Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.055532 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-cache" (OuterVolumeSpecName: "cache") pod 
"6bc62dd5-67bd-4e26-bedb-58e1d56abac9" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.055764 4948 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-cache\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.056268 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-lock" (OuterVolumeSpecName: "lock") pod "6bc62dd5-67bd-4e26-bedb-58e1d56abac9" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.058574 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-kube-api-access-567f8" (OuterVolumeSpecName: "kube-api-access-567f8") pod "6bc62dd5-67bd-4e26-bedb-58e1d56abac9" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9"). InnerVolumeSpecName "kube-api-access-567f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.058646 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6bc62dd5-67bd-4e26-bedb-58e1d56abac9" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.059581 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "6bc62dd5-67bd-4e26-bedb-58e1d56abac9" (UID: "6bc62dd5-67bd-4e26-bedb-58e1d56abac9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.087122 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rzjh8_1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d/ovs-vswitchd/0.log" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.088358 4948 generic.go:334] "Generic (PLEG): container finished" podID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" exitCode=137 Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.088445 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerDied","Data":"8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d"} Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.088558 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rzjh8" event={"ID":"1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d","Type":"ContainerDied","Data":"ad1e99440685698ea5d9356335de4b56828cb0bac1f1fd589e958d7c34c024d5"} Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.088567 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rzjh8" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.088639 4948 scope.go:117] "RemoveContainer" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.096947 4948 generic.go:334] "Generic (PLEG): container finished" podID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerID="29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d" exitCode=137 Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.096998 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d"} Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.097033 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6bc62dd5-67bd-4e26-bedb-58e1d56abac9","Type":"ContainerDied","Data":"09d33a5cf80f62c8d95761a44c450d2dfb78eb56feb8959b358ccdffeb6f8f27"} Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.097209 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.130021 4948 scope.go:117] "RemoveContainer" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.130226 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-rzjh8"] Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.138863 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-rzjh8"] Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.156861 4948 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-lock\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.156898 4948 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.156921 4948 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.156932 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-567f8\" (UniqueName: \"kubernetes.io/projected/6bc62dd5-67bd-4e26-bedb-58e1d56abac9-kube-api-access-567f8\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.174548 4948 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.177853 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 04 18:01:03 crc 
kubenswrapper[4948]: I1204 18:01:03.183503 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.185848 4948 scope.go:117] "RemoveContainer" containerID="d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.218284 4948 scope.go:117] "RemoveContainer" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.218742 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d\": container with ID starting with 8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d not found: ID does not exist" containerID="8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.218771 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d"} err="failed to get container status \"8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d\": rpc error: code = NotFound desc = could not find container \"8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d\": container with ID starting with 8c53dafa09b29dbbff134ec12b1d121701a97a875543bae1ab97983e77f45e8d not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.218792 4948 scope.go:117] "RemoveContainer" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.219480 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0\": container with 
ID starting with 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 not found: ID does not exist" containerID="8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.219505 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0"} err="failed to get container status \"8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0\": rpc error: code = NotFound desc = could not find container \"8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0\": container with ID starting with 8673dfa45c06fcacf98e39267a5fc7552ad4dd72955f13c9e137d22f476ae8d0 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.219550 4948 scope.go:117] "RemoveContainer" containerID="d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.219953 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02\": container with ID starting with d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02 not found: ID does not exist" containerID="d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.219979 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02"} err="failed to get container status \"d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02\": rpc error: code = NotFound desc = could not find container \"d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02\": container with ID starting with d03347a3e9ee39d0ebc7811422e5a17e710306c642a354e5752004c840539b02 not 
found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.219995 4948 scope.go:117] "RemoveContainer" containerID="29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.237036 4948 scope.go:117] "RemoveContainer" containerID="a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.252472 4948 scope.go:117] "RemoveContainer" containerID="ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.258616 4948 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.273488 4948 scope.go:117] "RemoveContainer" containerID="bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.290126 4948 scope.go:117] "RemoveContainer" containerID="bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.306330 4948 scope.go:117] "RemoveContainer" containerID="0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.323838 4948 scope.go:117] "RemoveContainer" containerID="716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.344534 4948 scope.go:117] "RemoveContainer" containerID="cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.363852 4948 scope.go:117] "RemoveContainer" containerID="d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.385387 4948 scope.go:117] "RemoveContainer" 
containerID="bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.406408 4948 scope.go:117] "RemoveContainer" containerID="014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.424334 4948 scope.go:117] "RemoveContainer" containerID="86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.441167 4948 scope.go:117] "RemoveContainer" containerID="5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.455054 4948 scope.go:117] "RemoveContainer" containerID="07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.469832 4948 scope.go:117] "RemoveContainer" containerID="d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.488837 4948 scope.go:117] "RemoveContainer" containerID="29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.490558 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d\": container with ID starting with 29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d not found: ID does not exist" containerID="29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.490646 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d"} err="failed to get container status \"29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d\": rpc error: code = NotFound desc = could not 
find container \"29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d\": container with ID starting with 29b62a9b52fdbf9728d7037d4266eea7ed78ffcca5519df979b72ebfd87cd73d not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.490716 4948 scope.go:117] "RemoveContainer" containerID="a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.491077 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347\": container with ID starting with a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347 not found: ID does not exist" containerID="a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.491113 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347"} err="failed to get container status \"a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347\": rpc error: code = NotFound desc = could not find container \"a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347\": container with ID starting with a0698b2b45e7ff080da301a955a3793a3f72d703a866ce848367a61ed1aba347 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.491156 4948 scope.go:117] "RemoveContainer" containerID="ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.491662 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4\": container with ID starting with ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4 not found: ID 
does not exist" containerID="ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.491708 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4"} err="failed to get container status \"ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4\": rpc error: code = NotFound desc = could not find container \"ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4\": container with ID starting with ac8127bf4c1bf1c013cd9b68f254b2148a40ba30a6783df3e59e6a10a95c98c4 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.491723 4948 scope.go:117] "RemoveContainer" containerID="bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.492256 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd\": container with ID starting with bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd not found: ID does not exist" containerID="bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.492283 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd"} err="failed to get container status \"bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd\": rpc error: code = NotFound desc = could not find container \"bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd\": container with ID starting with bf847bbb855494021f098db5ce0acd61a5f7b006eeb3627d6c9d359c3b115bdd not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.492319 4948 
scope.go:117] "RemoveContainer" containerID="bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.493106 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16\": container with ID starting with bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16 not found: ID does not exist" containerID="bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.493136 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16"} err="failed to get container status \"bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16\": rpc error: code = NotFound desc = could not find container \"bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16\": container with ID starting with bd18c9bba959e6306693b354cc5d2fcce59f6648e5d9d3950d80aad18163ad16 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.493151 4948 scope.go:117] "RemoveContainer" containerID="0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.493552 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c\": container with ID starting with 0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c not found: ID does not exist" containerID="0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.493580 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c"} err="failed to get container status \"0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c\": rpc error: code = NotFound desc = could not find container \"0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c\": container with ID starting with 0c64d352a124377dac075599a667ef326a0bd41bc683898babb4c3aa380b459c not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.493594 4948 scope.go:117] "RemoveContainer" containerID="716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.493836 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528\": container with ID starting with 716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528 not found: ID does not exist" containerID="716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.493862 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528"} err="failed to get container status \"716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528\": rpc error: code = NotFound desc = could not find container \"716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528\": container with ID starting with 716695f23d0aebc0a6baf7b48f0c06e28ae0595c2ed5aba2ec0ebb6447bad528 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.493887 4948 scope.go:117] "RemoveContainer" containerID="cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.494131 4948 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a\": container with ID starting with cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a not found: ID does not exist" containerID="cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.494154 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a"} err="failed to get container status \"cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a\": rpc error: code = NotFound desc = could not find container \"cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a\": container with ID starting with cb4e119a671ea966d80ddb3536419e86348c0240a4832eaab9210981f10fb56a not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.494169 4948 scope.go:117] "RemoveContainer" containerID="d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.494404 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77\": container with ID starting with d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77 not found: ID does not exist" containerID="d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.494421 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77"} err="failed to get container status \"d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77\": rpc error: code = NotFound desc = could not find container 
\"d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77\": container with ID starting with d2032447fd5d763cb064b3b96b3c5bcb8312b02c4d7194401c2a14057306ab77 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.494433 4948 scope.go:117] "RemoveContainer" containerID="bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.494880 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430\": container with ID starting with bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430 not found: ID does not exist" containerID="bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.494930 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430"} err="failed to get container status \"bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430\": rpc error: code = NotFound desc = could not find container \"bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430\": container with ID starting with bc7e638dc0e1fc0d14672a696a5cd25d6a30e774d1382d2c1d5f3dfe6e97d430 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.494964 4948 scope.go:117] "RemoveContainer" containerID="014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.495487 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c\": container with ID starting with 014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c not found: ID does not exist" 
containerID="014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.495527 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c"} err="failed to get container status \"014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c\": rpc error: code = NotFound desc = could not find container \"014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c\": container with ID starting with 014073f8db13a2189858580ad4268049ceacdeac305fb589c4d684cbc8837a2c not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.495553 4948 scope.go:117] "RemoveContainer" containerID="86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.495942 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069\": container with ID starting with 86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069 not found: ID does not exist" containerID="86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.495980 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069"} err="failed to get container status \"86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069\": rpc error: code = NotFound desc = could not find container \"86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069\": container with ID starting with 86369149bf936876853254b6adc8966ec8dadd291ecbde9f94706e7d926b0069 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.496031 4948 scope.go:117] 
"RemoveContainer" containerID="5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.496494 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3\": container with ID starting with 5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3 not found: ID does not exist" containerID="5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.496550 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3"} err="failed to get container status \"5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3\": rpc error: code = NotFound desc = could not find container \"5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3\": container with ID starting with 5dfcbc8ec7e81b6e858920d88192b39e0c530064ffe44d395ea9b27aa3f992e3 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.496569 4948 scope.go:117] "RemoveContainer" containerID="07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.497449 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5\": container with ID starting with 07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5 not found: ID does not exist" containerID="07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.497498 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5"} err="failed to get container status \"07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5\": rpc error: code = NotFound desc = could not find container \"07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5\": container with ID starting with 07e72f8c69b1e86ea7aabcfd9ae8c8ad94ca740e196c457216a5a949b0f4b1d5 not found: ID does not exist" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.497514 4948 scope.go:117] "RemoveContainer" containerID="d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a" Dec 04 18:01:03 crc kubenswrapper[4948]: E1204 18:01:03.497877 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a\": container with ID starting with d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a not found: ID does not exist" containerID="d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a" Dec 04 18:01:03 crc kubenswrapper[4948]: I1204 18:01:03.497912 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a"} err="failed to get container status \"d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a\": rpc error: code = NotFound desc = could not find container \"d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a\": container with ID starting with d01c74ecddfb8f5da11005a9e8c194440720a9620fca77d93805e6a50499279a not found: ID does not exist" Dec 04 18:01:04 crc kubenswrapper[4948]: I1204 18:01:04.928191 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" path="/var/lib/kubelet/pods/1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d/volumes" Dec 04 18:01:04 crc kubenswrapper[4948]: I1204 
18:01:04.929743 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" path="/var/lib/kubelet/pods/6bc62dd5-67bd-4e26-bedb-58e1d56abac9/volumes" Dec 04 18:01:06 crc kubenswrapper[4948]: I1204 18:01:06.277492 4948 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod10997b06-2476-4c6c-865d-1e5927e75fac"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod10997b06-2476-4c6c-865d-1e5927e75fac] : Timed out while waiting for systemd to remove kubepods-besteffort-pod10997b06_2476_4c6c_865d_1e5927e75fac.slice" Dec 04 18:01:06 crc kubenswrapper[4948]: E1204 18:01:06.277553 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod10997b06-2476-4c6c-865d-1e5927e75fac] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod10997b06-2476-4c6c-865d-1e5927e75fac] : Timed out while waiting for systemd to remove kubepods-besteffort-pod10997b06_2476_4c6c_865d_1e5927e75fac.slice" pod="openstack/openstack-cell1-galera-0" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" Dec 04 18:01:07 crc kubenswrapper[4948]: I1204 18:01:07.139881 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 18:01:07 crc kubenswrapper[4948]: I1204 18:01:07.157568 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 18:01:07 crc kubenswrapper[4948]: I1204 18:01:07.165333 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 18:01:07 crc kubenswrapper[4948]: I1204 18:01:07.713777 4948 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode318bac5-87da-4a9b-9d73-8065c65f4b61"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode318bac5-87da-4a9b-9d73-8065c65f4b61] : Timed out while waiting for systemd to remove kubepods-besteffort-pode318bac5_87da_4a9b_9d73_8065c65f4b61.slice" Dec 04 18:01:07 crc kubenswrapper[4948]: E1204 18:01:07.714087 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pode318bac5-87da-4a9b-9d73-8065c65f4b61] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode318bac5-87da-4a9b-9d73-8065c65f4b61] : Timed out while waiting for systemd to remove kubepods-besteffort-pode318bac5_87da_4a9b_9d73_8065c65f4b61.slice" pod="openstack/nova-cell1-conductor-0" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" Dec 04 18:01:08 crc kubenswrapper[4948]: I1204 18:01:08.147958 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 18:01:08 crc kubenswrapper[4948]: I1204 18:01:08.167676 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 18:01:08 crc kubenswrapper[4948]: I1204 18:01:08.173777 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 18:01:08 crc kubenswrapper[4948]: I1204 18:01:08.924214 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" path="/var/lib/kubelet/pods/10997b06-2476-4c6c-865d-1e5927e75fac/volumes" Dec 04 18:01:08 crc kubenswrapper[4948]: I1204 18:01:08.924993 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" path="/var/lib/kubelet/pods/e318bac5-87da-4a9b-9d73-8065c65f4b61/volumes" Dec 04 18:01:10 crc kubenswrapper[4948]: E1204 18:01:10.387585 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:01:10 crc kubenswrapper[4948]: E1204 18:01:10.387706 4948 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 18:01:10 crc kubenswrapper[4948]: E1204 18:01:10.387716 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts podName:fbfcb6f8-1a5c-4de0-a75a-331dfcb39591 nodeName:}" failed. No retries permitted until 2025-12-04 18:01:42.387693804 +0000 UTC m=+2113.748768216 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts") pod "placement7046-account-delete-d78kq" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591") : configmap "openstack-scripts" not found Dec 04 18:01:10 crc kubenswrapper[4948]: E1204 18:01:10.387788 4948 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts podName:e1b64e38-8be0-41af-bf89-878d17bbd7a5 nodeName:}" failed. No retries permitted until 2025-12-04 18:01:42.387769056 +0000 UTC m=+2113.748843468 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts") pod "novacell0a2da-account-delete-2tst9" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5") : configmap "openstack-scripts" not found Dec 04 18:01:10 crc kubenswrapper[4948]: I1204 18:01:10.625491 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:01:10 crc kubenswrapper[4948]: I1204 18:01:10.625870 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:01:10 crc kubenswrapper[4948]: I1204 18:01:10.625927 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:01:10 crc kubenswrapper[4948]: I1204 18:01:10.626694 4948 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2179a66ea554870aee48aa7049abeb21ba84072bb2764a52d1cc7c10e4f11e50"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:01:10 crc kubenswrapper[4948]: I1204 18:01:10.626785 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://2179a66ea554870aee48aa7049abeb21ba84072bb2764a52d1cc7c10e4f11e50" gracePeriod=600 Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.205255 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="2179a66ea554870aee48aa7049abeb21ba84072bb2764a52d1cc7c10e4f11e50" exitCode=0 Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.205299 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"2179a66ea554870aee48aa7049abeb21ba84072bb2764a52d1cc7c10e4f11e50"} Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.205592 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7"} Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.205614 4948 scope.go:117] "RemoveContainer" containerID="3d9afdc4950ec84ad277a6df4000b632d2f2e360c34568a890291edf426987e5" Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.738572 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement7046-account-delete-d78kq" Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.909082 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b7hg\" (UniqueName: \"kubernetes.io/projected/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-kube-api-access-5b7hg\") pod \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.909220 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts\") pod \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\" (UID: \"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591\") " Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.910557 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.917028 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-kube-api-access-5b7hg" (OuterVolumeSpecName: "kube-api-access-5b7hg") pod "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" (UID: "fbfcb6f8-1a5c-4de0-a75a-331dfcb39591"). InnerVolumeSpecName "kube-api-access-5b7hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:01:11 crc kubenswrapper[4948]: I1204 18:01:11.974644 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0a2da-account-delete-2tst9" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.011565 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.011871 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b7hg\" (UniqueName: \"kubernetes.io/projected/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591-kube-api-access-5b7hg\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.112972 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts\") pod \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.113115 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnpft\" (UniqueName: \"kubernetes.io/projected/e1b64e38-8be0-41af-bf89-878d17bbd7a5-kube-api-access-jnpft\") pod \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\" (UID: \"e1b64e38-8be0-41af-bf89-878d17bbd7a5\") " Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.113653 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1b64e38-8be0-41af-bf89-878d17bbd7a5" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.116154 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b64e38-8be0-41af-bf89-878d17bbd7a5-kube-api-access-jnpft" (OuterVolumeSpecName: "kube-api-access-jnpft") pod "e1b64e38-8be0-41af-bf89-878d17bbd7a5" (UID: "e1b64e38-8be0-41af-bf89-878d17bbd7a5"). InnerVolumeSpecName "kube-api-access-jnpft". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.215165 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnpft\" (UniqueName: \"kubernetes.io/projected/e1b64e38-8be0-41af-bf89-878d17bbd7a5-kube-api-access-jnpft\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.215207 4948 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1b64e38-8be0-41af-bf89-878d17bbd7a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.222970 4948 generic.go:334] "Generic (PLEG): container finished" podID="e1b64e38-8be0-41af-bf89-878d17bbd7a5" containerID="336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe" exitCode=137 Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.223059 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0a2da-account-delete-2tst9" event={"ID":"e1b64e38-8be0-41af-bf89-878d17bbd7a5","Type":"ContainerDied","Data":"336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe"} Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.223098 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0a2da-account-delete-2tst9" event={"ID":"e1b64e38-8be0-41af-bf89-878d17bbd7a5","Type":"ContainerDied","Data":"df6e49ae2bdee362d0ade5799fa7b2e79b9b2af3709ed1474bfd46451a2158d4"} Dec 04 18:01:12 crc 
kubenswrapper[4948]: I1204 18:01:12.223117 4948 scope.go:117] "RemoveContainer" containerID="336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.223232 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0a2da-account-delete-2tst9" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.227749 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement7046-account-delete-d78kq" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.227740 4948 generic.go:334] "Generic (PLEG): container finished" podID="fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" containerID="3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2" exitCode=137 Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.227765 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7046-account-delete-d78kq" event={"ID":"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591","Type":"ContainerDied","Data":"3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2"} Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.227951 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7046-account-delete-d78kq" event={"ID":"fbfcb6f8-1a5c-4de0-a75a-331dfcb39591","Type":"ContainerDied","Data":"5b30dec60cadf6795af400b00340ae441c03abdc9d849bec81f8c28d4adf020f"} Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.251314 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0a2da-account-delete-2tst9"] Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.251983 4948 scope.go:117] "RemoveContainer" containerID="336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe" Dec 04 18:01:12 crc kubenswrapper[4948]: E1204 18:01:12.252282 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe\": container with ID starting with 336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe not found: ID does not exist" containerID="336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.252308 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe"} err="failed to get container status \"336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe\": rpc error: code = NotFound desc = could not find container \"336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe\": container with ID starting with 336d84a63184a486f9f6a450004a878f76935456f37607cc836ec861f377f4fe not found: ID does not exist" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.252326 4948 scope.go:117] "RemoveContainer" containerID="3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.256194 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0a2da-account-delete-2tst9"] Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.267860 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement7046-account-delete-d78kq"] Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.275576 4948 scope.go:117] "RemoveContainer" containerID="3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.275591 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement7046-account-delete-d78kq"] Dec 04 18:01:12 crc kubenswrapper[4948]: E1204 18:01:12.276108 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2\": 
container with ID starting with 3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2 not found: ID does not exist" containerID="3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.276156 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2"} err="failed to get container status \"3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2\": rpc error: code = NotFound desc = could not find container \"3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2\": container with ID starting with 3cb298e82a17858371fbac12a4ed9cffecb9d04ebfae96d2e8b776acce27f7c2 not found: ID does not exist" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.925198 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b64e38-8be0-41af-bf89-878d17bbd7a5" path="/var/lib/kubelet/pods/e1b64e38-8be0-41af-bf89-878d17bbd7a5/volumes" Dec 04 18:01:12 crc kubenswrapper[4948]: I1204 18:01:12.926428 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" path="/var/lib/kubelet/pods/fbfcb6f8-1a5c-4de0-a75a-331dfcb39591/volumes" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.463211 4948 scope.go:117] "RemoveContainer" containerID="9d872f42509c6df89e49d65db6b6dc809cb71b73f7f05093b33b930bc565da60" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.501272 4948 scope.go:117] "RemoveContainer" containerID="13f1ec3c600161183c7b13c25fb8ff3c4a268954e2d40bac4b5524f004c61111" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.542385 4948 scope.go:117] "RemoveContainer" containerID="afadd5bc8b50ff866da0f039ab345ad38c988c0f86ccd90c03589dbd3fca1a90" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.577947 4948 scope.go:117] "RemoveContainer" 
containerID="b6183c62bde6cb6fca075549c3d9d9e84661daae0b28f33907be13b8f3bc5e84" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.601461 4948 scope.go:117] "RemoveContainer" containerID="4e309c03d554eea0bb3db4cf9ff24a3b3fa6b44b749e0a2339c0c21f05783d2a" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.625475 4948 scope.go:117] "RemoveContainer" containerID="5111ccd42a2dcb9a24627bf842d9a0b851e3ae53f8f5be34d0dd24d8c4061014" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.650639 4948 scope.go:117] "RemoveContainer" containerID="82c901cf00202ab9ecd08dc4c09ede1d9fcdc9869bb58784238db83a9b10208f" Dec 04 18:01:38 crc kubenswrapper[4948]: I1204 18:01:38.668990 4948 scope.go:117] "RemoveContainer" containerID="beb87d7c4d42b358a7b2c380c851f944dc0d5a8efb5eaf6f4fed99b0a0bf02b0" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.144754 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9lqht"] Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146027 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146062 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146082 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146090 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146107 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="proxy-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 
18:02:09.146116 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="proxy-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146124 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker-log" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146133 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker-log" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146143 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerName="galera" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146151 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerName="galera" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146163 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146170 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146179 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146185 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146199 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ae0228-b131-4cec-a52f-b5786c22355c" containerName="openstack-network-exporter" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146207 4948 
state_mem.go:107] "Deleted CPUSet assignment" podUID="64ae0228-b131-4cec-a52f-b5786c22355c" containerName="openstack-network-exporter" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146220 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b64e38-8be0-41af-bf89-878d17bbd7a5" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146226 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b64e38-8be0-41af-bf89-878d17bbd7a5" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146237 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59806891-9fa2-446a-87c1-b7efbf4b692b" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146245 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="59806891-9fa2-446a-87c1-b7efbf4b692b" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146255 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce6fe82-2dcb-49cd-851a-446e66038965" containerName="memcached" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146261 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce6fe82-2dcb-49cd-851a-446e66038965" containerName="memcached" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146267 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="ovn-northd" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146273 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="ovn-northd" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146283 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-expirer" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146290 
4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-expirer" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146302 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d563e18-b478-40af-b4c6-b2dd89ea863a" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146309 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d563e18-b478-40af-b4c6-b2dd89ea863a" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146322 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="ovsdbserver-sb" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146328 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="ovsdbserver-sb" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146335 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="swift-recon-cron" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146341 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="swift-recon-cron" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146349 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="ovsdbserver-nb" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146356 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="ovsdbserver-nb" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146367 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214441b7-69b1-4518-a135-73de11d39a1d" containerName="nova-cell0-conductor-conductor" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 
18:02:09.146373 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="214441b7-69b1-4518-a135-73de11d39a1d" containerName="nova-cell0-conductor-conductor" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146384 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146390 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-httpd" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146401 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-metadata" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146407 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-metadata" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146416 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="openstack-network-exporter" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146422 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="openstack-network-exporter" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146431 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-log" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146436 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-log" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146444 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-api" Dec 04 18:02:09 crc 
kubenswrapper[4948]: I1204 18:02:09.146451 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-api" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146462 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="openstack-network-exporter" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146467 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="openstack-network-exporter" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146474 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-updater" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146480 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-updater" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146488 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerName="mysql-bootstrap" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146495 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerName="mysql-bootstrap" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146501 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-auditor" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146508 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-auditor" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146519 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="cinder-scheduler" Dec 04 
18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146525 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="cinder-scheduler" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146534 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api-log" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146540 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api-log" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146552 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-api" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146559 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-api" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146569 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80f9ff11-f145-4e76-a9fc-084de8ccb029" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146574 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f9ff11-f145-4e76-a9fc-084de8ccb029" containerName="mariadb-account-delete" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146584 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq" Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146591 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq" Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146599 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api-log" Dec 04 18:02:09 crc 
kubenswrapper[4948]: I1204 18:02:09.146626 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146636 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acee6d3-23af-4793-8e56-8f3fbc169779" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146642 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acee6d3-23af-4793-8e56-8f3fbc169779" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146653 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="rsync"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146660 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="rsync"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146668 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-central-agent"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146674 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-central-agent"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146686 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146692 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146702 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146708 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146716 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3326569d-4475-4365-8d93-b2b1522b6f60" containerName="dnsmasq-dns"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146722 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3326569d-4475-4365-8d93-b2b1522b6f60" containerName="dnsmasq-dns"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146731 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerName="nova-scheduler-scheduler"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146738 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerName="nova-scheduler-scheduler"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146746 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-reaper"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146752 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-reaper"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146763 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146769 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146777 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerName="ovn-controller"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146784 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerName="ovn-controller"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146790 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" containerName="galera"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146798 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" containerName="galera"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146808 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146815 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146827 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146833 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146841 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e5cc30-bac1-418c-af51-af5cb1d8d595" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146849 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e5cc30-bac1-418c-af51-af5cb1d8d595" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146857 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="rabbitmq"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146864 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="rabbitmq"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146879 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146886 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146894 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb168081-824d-45ef-a815-b96d44b58b7c" containerName="keystone-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146902 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb168081-824d-45ef-a815-b96d44b58b7c" containerName="keystone-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146910 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" containerName="nova-cell1-conductor-conductor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146917 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" containerName="nova-cell1-conductor-conductor"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146927 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="setup-container"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146933 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="setup-container"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146939 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="setup-container"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146945 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="setup-container"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146956 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146965 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146977 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146983 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.146992 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.146998 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147008 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" containerName="kube-state-metrics"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147014 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" containerName="kube-state-metrics"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147023 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147030 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147057 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147064 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147072 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3326569d-4475-4365-8d93-b2b1522b6f60" containerName="init"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147078 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3326569d-4475-4365-8d93-b2b1522b6f60" containerName="init"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147087 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147093 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147102 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-updater"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147109 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-updater"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147118 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-notification-agent"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147125 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-notification-agent"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147133 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="probe"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147139 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="probe"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147149 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147154 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147163 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147170 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147179 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147185 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147192 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="sg-core"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147199 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="sg-core"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147210 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147216 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147222 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6458efcd-4f47-46a1-92ab-3f1c77035cce" containerName="nova-cell1-novncproxy-novncproxy"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147229 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6458efcd-4f47-46a1-92ab-3f1c77035cce" containerName="nova-cell1-novncproxy-novncproxy"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147239 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48d8f605-3274-40ec-8a30-8dc188fdcd86" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147246 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="48d8f605-3274-40ec-8a30-8dc188fdcd86" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147257 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server-init"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147263 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server-init"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147272 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147278 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147286 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="openstack-network-exporter"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147293 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="openstack-network-exporter"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147300 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147305 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147315 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147321 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147330 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" containerName="mysql-bootstrap"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147336 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" containerName="mysql-bootstrap"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147344 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147350 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: E1204 18:02:09.147361 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147366 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147531 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147548 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e318bac5-87da-4a9b-9d73-8065c65f4b61" containerName="nova-cell1-conductor-conductor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147559 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147567 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="openstack-network-exporter"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147576 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147583 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfcb6f8-1a5c-4de0-a75a-331dfcb39591" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147594 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147604 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147614 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147630 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="214441b7-69b1-4518-a135-73de11d39a1d" containerName="nova-cell0-conductor-conductor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147642 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-metadata"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147649 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbda827a-8528-4b7f-8d4c-70fe8be65d27" containerName="nova-scheduler-scheduler"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147657 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="59806891-9fa2-446a-87c1-b7efbf4b692b" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147669 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d563e18-b478-40af-b4c6-b2dd89ea863a" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147679 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="80f9ff11-f145-4e76-a9fc-084de8ccb029" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147690 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1070ca-9f1b-4e0a-b0fb-a90a5ecfbadc" containerName="kube-state-metrics"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147696 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovsdb-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147703 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="openstack-network-exporter"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147711 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-expirer"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147721 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147730 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3326569d-4475-4365-8d93-b2b1522b6f60" containerName="dnsmasq-dns"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147740 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147747 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147755 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b365e8-6c2a-41fe-b50a-1702144d67d4" containerName="ovn-northd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147765 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-updater"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147772 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6458efcd-4f47-46a1-92ab-3f1c77035cce" containerName="nova-cell1-novncproxy-novncproxy"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147782 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147790 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-central-agent"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147799 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="sg-core"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147809 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="proxy-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147818 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acee6d3-23af-4793-8e56-8f3fbc169779" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147827 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="openstack-network-exporter"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147834 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b64e38-8be0-41af-bf89-878d17bbd7a5" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147841 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="swift-recon-cron"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147848 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147855 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c881bee3-e2f3-4da4-a12f-00db430e4323" containerName="glance-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147863 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd3a8e9-b3bb-4d04-aaf1-ac66d57e863d" containerName="ovs-vswitchd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147871 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34ca165-31d6-44fa-b175-ed2b1bf9f766" containerName="rabbitmq"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147878 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-updater"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147886 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce6fe82-2dcb-49cd-851a-446e66038965" containerName="memcached"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147896 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad8a42a-8a49-46c9-b4fe-9bb1f95cece1" containerName="ovn-controller"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147902 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="rsync"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147909 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147915 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b408db-1dec-49e0-8212-1193d4fe6a37" containerName="nova-metadata-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147922 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="48d8f605-3274-40ec-8a30-8dc188fdcd86" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147931 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ecb28d-b878-4b16-a46a-9d9be1441aca" containerName="proxy-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147943 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-auditor"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147953 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e905edc7-cd78-48c2-9192-fb18e1d193ac" containerName="barbican-keystone-listener"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147963 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdfb0b25-9afe-4c01-8e54-e9c5adbd59b3" containerName="nova-api-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147977 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b4baf7-8366-4f47-8515-c33e1b691856" containerName="rabbitmq"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147987 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3e0d09-a01a-4f1c-9fbd-60a23a823e31" containerName="barbican-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.147995 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="117c809e-76fd-458e-acbf-e2f6ce2d2f43" containerName="placement-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148003 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6840a402-94d3-48e6-9ccb-d578573e430a" containerName="ovsdbserver-sb"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148014 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="probe"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148025 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148063 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdde2fd-5c98-4b6f-b9a5-a746a454fafd" containerName="cinder-api-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148075 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-reaper"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148084 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ae0228-b131-4cec-a52f-b5786c22355c" containerName="openstack-network-exporter"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148096 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="10997b06-2476-4c6c-865d-1e5927e75fac" containerName="galera"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148136 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e5cc30-bac1-418c-af51-af5cb1d8d595" containerName="mariadb-account-delete"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148150 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148164 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc3ac35-04df-4516-8623-b6a0d855c98a" containerName="ovsdbserver-nb"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148177 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-server"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148187 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94e22e0-c0d1-4233-b21c-9860d204c068" containerName="barbican-worker-log"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148197 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b7f08d-e8bb-4fd7-a3b5-2fa0c94a0d8a" containerName="ceilometer-notification-agent"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148206 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="account-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148214 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="container-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148227 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc62dd5-67bd-4e26-bedb-58e1d56abac9" containerName="object-replicator"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148242 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc74dcc-f8d8-4852-913a-77cb4526eed7" containerName="neutron-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148253 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb168081-824d-45ef-a815-b96d44b58b7c" containerName="keystone-api"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148261 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdac4fb3-a888-4781-b1e0-99630c84fe0f" containerName="cinder-scheduler"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148270 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="27244fac-7ff8-4ca0-9002-ef85f78a2564" containerName="galera"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.148280 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c08574c-af0f-4e7c-81af-b180b29ce4ee" containerName="glance-httpd"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.149672 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.155824 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9lqht"]
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.263859 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkh8p\" (UniqueName: \"kubernetes.io/projected/15821477-3678-486e-ad56-cd285b05f80f-kube-api-access-qkh8p\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.263918 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-utilities\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.263943 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-catalog-content\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.365401 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkh8p\" (UniqueName: \"kubernetes.io/projected/15821477-3678-486e-ad56-cd285b05f80f-kube-api-access-qkh8p\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.365446 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-utilities\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.365475 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-catalog-content\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.366003 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-catalog-content\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.366105 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-utilities\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.390864 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkh8p\" (UniqueName: \"kubernetes.io/projected/15821477-3678-486e-ad56-cd285b05f80f-kube-api-access-qkh8p\") pod \"redhat-operators-9lqht\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.469868 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lqht"
Dec 04 18:02:09 crc kubenswrapper[4948]: I1204 18:02:09.983632 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9lqht"]
Dec 04 18:02:10 crc kubenswrapper[4948]: I1204 18:02:10.824771 4948 generic.go:334] "Generic (PLEG): container finished" podID="15821477-3678-486e-ad56-cd285b05f80f" containerID="ee26ee96c0e95328675b5e343a9001ef860fec778df3408c9fd346ea269fbb2a" exitCode=0
Dec 04 18:02:10 crc kubenswrapper[4948]: I1204 18:02:10.824824 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerDied","Data":"ee26ee96c0e95328675b5e343a9001ef860fec778df3408c9fd346ea269fbb2a"}
Dec 04 18:02:10 crc kubenswrapper[4948]: I1204 18:02:10.825135 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerStarted","Data":"90d511a6abb4ad177c53e42e18b4d951d49cab17f39d8eb93b8b39827621b318"}
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.342765 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gk5v9"]
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.344464 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gk5v9"
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.358821 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gk5v9"]
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.501848 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtk9b\" (UniqueName: \"kubernetes.io/projected/d901db91-33bd-4cab-b6e1-aa5f341d7446-kube-api-access-xtk9b\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9"
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.502303 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-utilities\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9"
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.502493 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-catalog-content\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9"
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.603676 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtk9b\" (UniqueName: \"kubernetes.io/projected/d901db91-33bd-4cab-b6e1-aa5f341d7446-kube-api-access-xtk9b\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9"
Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.603732 4948
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-utilities\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.603781 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-catalog-content\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.604369 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-catalog-content\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.604720 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-utilities\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.622573 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtk9b\" (UniqueName: \"kubernetes.io/projected/d901db91-33bd-4cab-b6e1-aa5f341d7446-kube-api-access-xtk9b\") pod \"community-operators-gk5v9\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.681246 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:11 crc kubenswrapper[4948]: I1204 18:02:11.832544 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerStarted","Data":"2b449a1f6e00e2339c28e6d56b627ca186d4b2fbbb612f83aab00f31dccce6df"} Dec 04 18:02:12 crc kubenswrapper[4948]: I1204 18:02:12.179971 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gk5v9"] Dec 04 18:02:12 crc kubenswrapper[4948]: W1204 18:02:12.238367 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8 WatchSource:0}: Error finding container b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8: Status 404 returned error can't find the container with id b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8 Dec 04 18:02:12 crc kubenswrapper[4948]: I1204 18:02:12.842553 4948 generic.go:334] "Generic (PLEG): container finished" podID="15821477-3678-486e-ad56-cd285b05f80f" containerID="2b449a1f6e00e2339c28e6d56b627ca186d4b2fbbb612f83aab00f31dccce6df" exitCode=0 Dec 04 18:02:12 crc kubenswrapper[4948]: I1204 18:02:12.842631 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerDied","Data":"2b449a1f6e00e2339c28e6d56b627ca186d4b2fbbb612f83aab00f31dccce6df"} Dec 04 18:02:12 crc kubenswrapper[4948]: I1204 18:02:12.846295 4948 generic.go:334] "Generic (PLEG): container finished" podID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerID="bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124" exitCode=0 Dec 04 18:02:12 crc kubenswrapper[4948]: I1204 
18:02:12.846333 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerDied","Data":"bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124"} Dec 04 18:02:12 crc kubenswrapper[4948]: I1204 18:02:12.846363 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerStarted","Data":"b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8"} Dec 04 18:02:13 crc kubenswrapper[4948]: I1204 18:02:13.855821 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerStarted","Data":"2d8a84cb67a3516eaa7d3224407d2036bbabaf468d88c3adde3d31cc7cf1cc16"} Dec 04 18:02:13 crc kubenswrapper[4948]: I1204 18:02:13.857635 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerStarted","Data":"226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295"} Dec 04 18:02:13 crc kubenswrapper[4948]: I1204 18:02:13.881001 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9lqht" podStartSLOduration=2.467902418 podStartE2EDuration="4.880980812s" podCreationTimestamp="2025-12-04 18:02:09 +0000 UTC" firstStartedPulling="2025-12-04 18:02:10.829544135 +0000 UTC m=+2142.190618557" lastFinishedPulling="2025-12-04 18:02:13.242622549 +0000 UTC m=+2144.603696951" observedRunningTime="2025-12-04 18:02:13.872008938 +0000 UTC m=+2145.233083340" watchObservedRunningTime="2025-12-04 18:02:13.880980812 +0000 UTC m=+2145.242055214" Dec 04 18:02:14 crc kubenswrapper[4948]: I1204 18:02:14.870437 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerID="226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295" exitCode=0 Dec 04 18:02:14 crc kubenswrapper[4948]: I1204 18:02:14.870577 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerDied","Data":"226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295"} Dec 04 18:02:15 crc kubenswrapper[4948]: I1204 18:02:15.892610 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerStarted","Data":"538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd"} Dec 04 18:02:19 crc kubenswrapper[4948]: I1204 18:02:19.470100 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9lqht" Dec 04 18:02:19 crc kubenswrapper[4948]: I1204 18:02:19.470582 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9lqht" Dec 04 18:02:19 crc kubenswrapper[4948]: I1204 18:02:19.521209 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9lqht" Dec 04 18:02:19 crc kubenswrapper[4948]: I1204 18:02:19.539687 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gk5v9" podStartSLOduration=5.827907609 podStartE2EDuration="8.539671094s" podCreationTimestamp="2025-12-04 18:02:11 +0000 UTC" firstStartedPulling="2025-12-04 18:02:12.847997541 +0000 UTC m=+2144.209071973" lastFinishedPulling="2025-12-04 18:02:15.559761026 +0000 UTC m=+2146.920835458" observedRunningTime="2025-12-04 18:02:15.924128358 +0000 UTC m=+2147.285202840" watchObservedRunningTime="2025-12-04 18:02:19.539671094 +0000 UTC m=+2150.900745496" Dec 04 18:02:19 crc 
kubenswrapper[4948]: I1204 18:02:19.986153 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9lqht" Dec 04 18:02:21 crc kubenswrapper[4948]: I1204 18:02:21.682884 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:21 crc kubenswrapper[4948]: I1204 18:02:21.682938 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:21 crc kubenswrapper[4948]: I1204 18:02:21.740712 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:22 crc kubenswrapper[4948]: I1204 18:02:22.004791 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:24 crc kubenswrapper[4948]: I1204 18:02:24.130375 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9lqht"] Dec 04 18:02:24 crc kubenswrapper[4948]: I1204 18:02:24.130954 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9lqht" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="registry-server" containerID="cri-o://2d8a84cb67a3516eaa7d3224407d2036bbabaf468d88c3adde3d31cc7cf1cc16" gracePeriod=2 Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.008072 4948 generic.go:334] "Generic (PLEG): container finished" podID="15821477-3678-486e-ad56-cd285b05f80f" containerID="2d8a84cb67a3516eaa7d3224407d2036bbabaf468d88c3adde3d31cc7cf1cc16" exitCode=0 Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.008163 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" 
event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerDied","Data":"2d8a84cb67a3516eaa7d3224407d2036bbabaf468d88c3adde3d31cc7cf1cc16"} Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.241196 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lqht" Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.337573 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkh8p\" (UniqueName: \"kubernetes.io/projected/15821477-3678-486e-ad56-cd285b05f80f-kube-api-access-qkh8p\") pod \"15821477-3678-486e-ad56-cd285b05f80f\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.337691 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-utilities\") pod \"15821477-3678-486e-ad56-cd285b05f80f\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.337768 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-catalog-content\") pod \"15821477-3678-486e-ad56-cd285b05f80f\" (UID: \"15821477-3678-486e-ad56-cd285b05f80f\") " Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.338882 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-utilities" (OuterVolumeSpecName: "utilities") pod "15821477-3678-486e-ad56-cd285b05f80f" (UID: "15821477-3678-486e-ad56-cd285b05f80f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.339064 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.343277 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15821477-3678-486e-ad56-cd285b05f80f-kube-api-access-qkh8p" (OuterVolumeSpecName: "kube-api-access-qkh8p") pod "15821477-3678-486e-ad56-cd285b05f80f" (UID: "15821477-3678-486e-ad56-cd285b05f80f"). InnerVolumeSpecName "kube-api-access-qkh8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.440702 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkh8p\" (UniqueName: \"kubernetes.io/projected/15821477-3678-486e-ad56-cd285b05f80f-kube-api-access-qkh8p\") on node \"crc\" DevicePath \"\"" Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.463481 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15821477-3678-486e-ad56-cd285b05f80f" (UID: "15821477-3678-486e-ad56-cd285b05f80f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:02:27 crc kubenswrapper[4948]: I1204 18:02:27.542515 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15821477-3678-486e-ad56-cd285b05f80f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.019720 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lqht" event={"ID":"15821477-3678-486e-ad56-cd285b05f80f","Type":"ContainerDied","Data":"90d511a6abb4ad177c53e42e18b4d951d49cab17f39d8eb93b8b39827621b318"} Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.019782 4948 scope.go:117] "RemoveContainer" containerID="2d8a84cb67a3516eaa7d3224407d2036bbabaf468d88c3adde3d31cc7cf1cc16" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.019792 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lqht" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.050626 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9lqht"] Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.051014 4948 scope.go:117] "RemoveContainer" containerID="2b449a1f6e00e2339c28e6d56b627ca186d4b2fbbb612f83aab00f31dccce6df" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.061564 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9lqht"] Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.089567 4948 scope.go:117] "RemoveContainer" containerID="ee26ee96c0e95328675b5e343a9001ef860fec778df3408c9fd346ea269fbb2a" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.132625 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gk5v9"] Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.132995 4948 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-gk5v9" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="registry-server" containerID="cri-o://538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd" gracePeriod=2 Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.497334 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.659249 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-catalog-content\") pod \"d901db91-33bd-4cab-b6e1-aa5f341d7446\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.659322 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtk9b\" (UniqueName: \"kubernetes.io/projected/d901db91-33bd-4cab-b6e1-aa5f341d7446-kube-api-access-xtk9b\") pod \"d901db91-33bd-4cab-b6e1-aa5f341d7446\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.659359 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-utilities\") pod \"d901db91-33bd-4cab-b6e1-aa5f341d7446\" (UID: \"d901db91-33bd-4cab-b6e1-aa5f341d7446\") " Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.660204 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-utilities" (OuterVolumeSpecName: "utilities") pod "d901db91-33bd-4cab-b6e1-aa5f341d7446" (UID: "d901db91-33bd-4cab-b6e1-aa5f341d7446"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.663210 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d901db91-33bd-4cab-b6e1-aa5f341d7446-kube-api-access-xtk9b" (OuterVolumeSpecName: "kube-api-access-xtk9b") pod "d901db91-33bd-4cab-b6e1-aa5f341d7446" (UID: "d901db91-33bd-4cab-b6e1-aa5f341d7446"). InnerVolumeSpecName "kube-api-access-xtk9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.714335 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d901db91-33bd-4cab-b6e1-aa5f341d7446" (UID: "d901db91-33bd-4cab-b6e1-aa5f341d7446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.760808 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.760845 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtk9b\" (UniqueName: \"kubernetes.io/projected/d901db91-33bd-4cab-b6e1-aa5f341d7446-kube-api-access-xtk9b\") on node \"crc\" DevicePath \"\"" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.760860 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d901db91-33bd-4cab-b6e1-aa5f341d7446-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:02:28 crc kubenswrapper[4948]: I1204 18:02:28.930776 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15821477-3678-486e-ad56-cd285b05f80f" 
path="/var/lib/kubelet/pods/15821477-3678-486e-ad56-cd285b05f80f/volumes" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.030996 4948 generic.go:334] "Generic (PLEG): container finished" podID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerID="538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd" exitCode=0 Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.031065 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gk5v9" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.031092 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerDied","Data":"538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd"} Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.031500 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gk5v9" event={"ID":"d901db91-33bd-4cab-b6e1-aa5f341d7446","Type":"ContainerDied","Data":"b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8"} Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.031571 4948 scope.go:117] "RemoveContainer" containerID="538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.053744 4948 scope.go:117] "RemoveContainer" containerID="226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.063627 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gk5v9"] Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.074404 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gk5v9"] Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.079103 4948 scope.go:117] "RemoveContainer" 
containerID="bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.114560 4948 scope.go:117] "RemoveContainer" containerID="538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd" Dec 04 18:02:29 crc kubenswrapper[4948]: E1204 18:02:29.115118 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd\": container with ID starting with 538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd not found: ID does not exist" containerID="538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.115176 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd"} err="failed to get container status \"538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd\": rpc error: code = NotFound desc = could not find container \"538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd\": container with ID starting with 538a121812fb7d22511893da42e65b97de8a539ba34c2966346e7a7ffcb123bd not found: ID does not exist" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.115209 4948 scope.go:117] "RemoveContainer" containerID="226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295" Dec 04 18:02:29 crc kubenswrapper[4948]: E1204 18:02:29.115559 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295\": container with ID starting with 226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295 not found: ID does not exist" containerID="226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295" Dec 04 18:02:29 crc 
kubenswrapper[4948]: I1204 18:02:29.115585 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295"} err="failed to get container status \"226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295\": rpc error: code = NotFound desc = could not find container \"226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295\": container with ID starting with 226aac71427854acc05c1d12c753c0ae4d6f6fb02c174b3d99322386f0f57295 not found: ID does not exist" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.115601 4948 scope.go:117] "RemoveContainer" containerID="bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124" Dec 04 18:02:29 crc kubenswrapper[4948]: E1204 18:02:29.115856 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124\": container with ID starting with bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124 not found: ID does not exist" containerID="bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124" Dec 04 18:02:29 crc kubenswrapper[4948]: I1204 18:02:29.116002 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124"} err="failed to get container status \"bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124\": rpc error: code = NotFound desc = could not find container \"bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124\": container with ID starting with bfaa79a1f7be6f986813443f649a459a20e4c3f5a6ffa1c1c3373bb4fb78b124 not found: ID does not exist" Dec 04 18:02:30 crc kubenswrapper[4948]: I1204 18:02:30.921431 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" 
path="/var/lib/kubelet/pods/d901db91-33bd-4cab-b6e1-aa5f341d7446/volumes" Dec 04 18:02:36 crc kubenswrapper[4948]: E1204 18:02:36.484700 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice\": RecentStats: unable to find data in memory cache]" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.144625 4948 scope.go:117] "RemoveContainer" containerID="d9ee9b5e032d4779ab04e1d2c193283f985ed2c160939ade90340fbe03abc118" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.204205 4948 scope.go:117] "RemoveContainer" containerID="9593921e030453b187f6e607d5dbc767989b02108eb3b864ff35cfd587ce4f9a" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.241224 4948 scope.go:117] "RemoveContainer" containerID="579dd03102cd4a05a136c774a013c1043b2a2531695e724eadbbbb097e630ecf" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.290220 4948 scope.go:117] "RemoveContainer" containerID="c8b321349d14a6fd9bd0edb8796b3c0b16d8a5faecb0046b5fad47aeb986e444" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.342410 4948 scope.go:117] "RemoveContainer" containerID="fd03367607ea4bd510783f1e087daf032195a481e4a39c53b6a6ee638bdff39b" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.361380 4948 scope.go:117] "RemoveContainer" containerID="ce3cf731c06ee83c40bae89c0c8e62893dd7be16f5ea71cde48d876fb17f3f41" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.381034 4948 scope.go:117] "RemoveContainer" containerID="31c1e49ead72861127107f90ce0fa37d7e78f909caae6944d8fec5ea338bd72e" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.396637 4948 scope.go:117] "RemoveContainer" 
containerID="c02365b05662e5ea5bd8a6b35e5c77b94f4aa4dc2c47d4a74dd31d23b02905f2" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.426846 4948 scope.go:117] "RemoveContainer" containerID="09a87ed238333efb556cbdeb3f665192194c7e862fa0cb98f5d2e669778e36e4" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.467190 4948 scope.go:117] "RemoveContainer" containerID="bb1acd9ac03f710a41e03bf6f8bb7f5222fb1c46ba1a6e8e77781d9d9c3dd560" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.488475 4948 scope.go:117] "RemoveContainer" containerID="27d9e1e5f2ef25fdf36a94ebfb879f2251655923eb85fc4f49b2a22dcc40fe16" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.508854 4948 scope.go:117] "RemoveContainer" containerID="584000ffffd365ef10e159d6f513183031557b7f69cda1709b0016cea7426d99" Dec 04 18:02:39 crc kubenswrapper[4948]: I1204 18:02:39.527951 4948 scope.go:117] "RemoveContainer" containerID="41c55b55d495ef9b147a733c4d666ff5ede3c80eb031a735bf7deb9b73dcdf08" Dec 04 18:02:46 crc kubenswrapper[4948]: E1204 18:02:46.674989 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice\": RecentStats: unable to find data in memory cache]" Dec 04 18:02:56 crc kubenswrapper[4948]: E1204 18:02:56.846474 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8\": RecentStats: unable to find data in memory cache]" Dec 04 18:03:07 crc kubenswrapper[4948]: E1204 18:03:07.069630 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8\": RecentStats: unable to find data in memory cache]" Dec 04 18:03:10 crc kubenswrapper[4948]: I1204 18:03:10.625745 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:03:10 crc kubenswrapper[4948]: I1204 18:03:10.626247 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:03:17 crc kubenswrapper[4948]: E1204 18:03:17.269502 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice\": RecentStats: unable to find data in memory cache]" Dec 04 18:03:27 crc kubenswrapper[4948]: E1204 18:03:27.532018 4948 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice/crio-b31e97ff604c5896f99241dcecbf797794cd8219cd02a1c689ec041bea7321e8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd901db91_33bd_4cab_b6e1_aa5f341d7446.slice\": RecentStats: unable to find data in memory cache]" Dec 04 18:03:39 crc kubenswrapper[4948]: I1204 18:03:39.744811 4948 scope.go:117] "RemoveContainer" containerID="0d71e6bfecc189c1867608db7a4fa58effd809b7c670edfd46414b17cef466f3" Dec 04 18:03:39 crc kubenswrapper[4948]: I1204 18:03:39.786435 4948 scope.go:117] "RemoveContainer" containerID="f64cbf7b6e43ed96dc6f100a33115d3b1f5b0f8fe82df36ecfd2f69eebd3aea8" Dec 04 18:03:39 crc kubenswrapper[4948]: I1204 18:03:39.839458 4948 scope.go:117] "RemoveContainer" containerID="1d1c539929d00b4f50893be637194adb40ec4e3377a6f0bd73cc9431dafdb02f" Dec 04 18:03:39 crc kubenswrapper[4948]: I1204 18:03:39.865801 4948 scope.go:117] "RemoveContainer" containerID="0d9bd5f05df11b25ce7671a425cd507c88b9ea45659b48b77b88a070061bf1ba" Dec 04 18:03:39 crc kubenswrapper[4948]: I1204 18:03:39.885154 4948 scope.go:117] "RemoveContainer" containerID="ff7daf90f4531c9b225c2a333f522204056bff71a75157eece31bc57ae7f99af" Dec 04 18:03:39 crc kubenswrapper[4948]: I1204 18:03:39.906329 4948 scope.go:117] "RemoveContainer" containerID="4e80c18e8fadf5646e4c1128551c4bd8744b986403bb55e6f92e5a0338335034" Dec 04 18:03:40 crc kubenswrapper[4948]: I1204 18:03:40.625188 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:03:40 crc kubenswrapper[4948]: I1204 18:03:40.625255 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.624817 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.625412 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.625463 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.626152 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:04:10 crc 
kubenswrapper[4948]: I1204 18:04:10.626207 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" gracePeriod=600 Dec 04 18:04:10 crc kubenswrapper[4948]: E1204 18:04:10.767797 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.970505 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" exitCode=0 Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.970547 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7"} Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.970583 4948 scope.go:117] "RemoveContainer" containerID="2179a66ea554870aee48aa7049abeb21ba84072bb2764a52d1cc7c10e4f11e50" Dec 04 18:04:10 crc kubenswrapper[4948]: I1204 18:04:10.971189 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:04:10 crc kubenswrapper[4948]: E1204 18:04:10.971561 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:04:25 crc kubenswrapper[4948]: I1204 18:04:25.914323 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:04:25 crc kubenswrapper[4948]: E1204 18:04:25.915183 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:04:39 crc kubenswrapper[4948]: I1204 18:04:39.913749 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:04:39 crc kubenswrapper[4948]: E1204 18:04:39.914981 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.015192 4948 scope.go:117] "RemoveContainer" containerID="9624c3c7c396b5a65af21a0e1c05976f033b2146793a0b81e5ffc20f4e076487" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.080689 4948 scope.go:117] "RemoveContainer" 
containerID="7d9b8f23ae2ef512cdc6ed34a56f990104bebe106cd1661d09bdc9adbcefb842" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.113755 4948 scope.go:117] "RemoveContainer" containerID="637497b65838d4e1875162878d30bf8895cfbdd36b9fd9f4596de491cb8f3761" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.145825 4948 scope.go:117] "RemoveContainer" containerID="7f8cd0c6abb5d7335ee66173e4a80074451a3871e27a24379b76520ba90371c4" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.171020 4948 scope.go:117] "RemoveContainer" containerID="cb54ba484742820a90bdd62eb825c4a750eea731a1479ae1d68670d2cb64bf30" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.193765 4948 scope.go:117] "RemoveContainer" containerID="d9ba434bb1ea4732ea4f9fd9d3132ecc30802fef6a73cbadd99f622f46b76466" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.214789 4948 scope.go:117] "RemoveContainer" containerID="9bcf71c31ca1e73a965969fb92ddba24201c5c11e6c3cb3b4e92e77ecdd4bf87" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.235668 4948 scope.go:117] "RemoveContainer" containerID="3ed5978b64fee059b95b3f3fcb1a1ab665b53aab15fb25269bdf21eeb866ef81" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.256458 4948 scope.go:117] "RemoveContainer" containerID="db8b9187d0c187cfc911c618a1e41befbb49ce369abd91c26b5274db741964ad" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.283304 4948 scope.go:117] "RemoveContainer" containerID="b058b84e4f67a262a8cae930973840aaf5fda1c3dfc929a04a6794fb308c7d61" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.307069 4948 scope.go:117] "RemoveContainer" containerID="4ae447a7f1fce2c6cfb74358e924d30d23ffd33d65e3ced21c1749bddfe8ce91" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.332218 4948 scope.go:117] "RemoveContainer" containerID="6a4f7ca8f85f0af89c130061e5e621f92b84e9721910c5c82f119b7a393b489d" Dec 04 18:04:40 crc kubenswrapper[4948]: I1204 18:04:40.350218 4948 scope.go:117] "RemoveContainer" 
containerID="6d7969f01db55d8e0829906eddc67bec05bbf6201f3fa18fbe56db4e90c70181" Dec 04 18:04:50 crc kubenswrapper[4948]: I1204 18:04:50.914109 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:04:50 crc kubenswrapper[4948]: E1204 18:04:50.914938 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:05:04 crc kubenswrapper[4948]: I1204 18:05:04.913934 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:05:04 crc kubenswrapper[4948]: E1204 18:05:04.915036 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:05:19 crc kubenswrapper[4948]: I1204 18:05:19.914353 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:05:19 crc kubenswrapper[4948]: E1204 18:05:19.915238 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:05:32 crc kubenswrapper[4948]: I1204 18:05:32.914984 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:05:32 crc kubenswrapper[4948]: E1204 18:05:32.916154 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:05:40 crc kubenswrapper[4948]: I1204 18:05:40.478906 4948 scope.go:117] "RemoveContainer" containerID="89b87f88c0e902dc36dbe577d822060dd19a3f3fc060e52e6927682aa0513a8c" Dec 04 18:05:40 crc kubenswrapper[4948]: I1204 18:05:40.527902 4948 scope.go:117] "RemoveContainer" containerID="0717fae57b36833de846ba2b339e69c1a155433d662f30df7772e1787eafb4f1" Dec 04 18:05:40 crc kubenswrapper[4948]: I1204 18:05:40.548860 4948 scope.go:117] "RemoveContainer" containerID="23fbbace28d9a40084a9fe5535d672e5b1491a24cd558b5f91e5c18a8993a49e" Dec 04 18:05:40 crc kubenswrapper[4948]: I1204 18:05:40.595796 4948 scope.go:117] "RemoveContainer" containerID="5f19696cdf8f2b7ca40c61e974811c8958a4c4252d9798e3c962a7dc83f23ba1" Dec 04 18:05:47 crc kubenswrapper[4948]: I1204 18:05:47.913380 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:05:47 crc kubenswrapper[4948]: E1204 18:05:47.914111 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:06:00 crc kubenswrapper[4948]: I1204 18:06:00.914400 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:06:00 crc kubenswrapper[4948]: E1204 18:06:00.915363 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:06:12 crc kubenswrapper[4948]: I1204 18:06:12.913933 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:06:12 crc kubenswrapper[4948]: E1204 18:06:12.914808 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:06:26 crc kubenswrapper[4948]: I1204 18:06:26.914447 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:06:26 crc kubenswrapper[4948]: E1204 18:06:26.915521 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:06:38 crc kubenswrapper[4948]: I1204 18:06:38.922249 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:06:38 crc kubenswrapper[4948]: E1204 18:06:38.923445 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:06:40 crc kubenswrapper[4948]: I1204 18:06:40.658600 4948 scope.go:117] "RemoveContainer" containerID="41105a549e6c1574bdd99b2d5b68e9a2cf1e4860742e66d69d7fe198180e35dd" Dec 04 18:06:40 crc kubenswrapper[4948]: I1204 18:06:40.685705 4948 scope.go:117] "RemoveContainer" containerID="aa1b78c1f482914f1113d2ebd5a93ee7b90f349e894f99d3d833add1e7595f33" Dec 04 18:06:40 crc kubenswrapper[4948]: I1204 18:06:40.741890 4948 scope.go:117] "RemoveContainer" containerID="a83c756f93cc2cc4a9dcdfb85cb2483978e21e0e29fa43c794ae8a92f922c6e4" Dec 04 18:06:40 crc kubenswrapper[4948]: I1204 18:06:40.800222 4948 scope.go:117] "RemoveContainer" containerID="ee2f0fda51ef8d33013ab45223908220da2e917552a17842ef747e4792ebb736" Dec 04 18:06:40 crc kubenswrapper[4948]: I1204 18:06:40.842755 4948 scope.go:117] "RemoveContainer" containerID="79e062b08955842c2dc531636633118d7f543daf41922d6d8d1ad534ae81a544" Dec 04 18:06:40 crc kubenswrapper[4948]: I1204 18:06:40.885127 4948 scope.go:117] "RemoveContainer" containerID="7872c1f6fd42a0803b14be61d6958d7a38a6d0fa6968c58defce7378683ea1cc" 
Dec 04 18:06:53 crc kubenswrapper[4948]: I1204 18:06:53.913331 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:06:53 crc kubenswrapper[4948]: E1204 18:06:53.913931 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:07:05 crc kubenswrapper[4948]: I1204 18:07:05.915835 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:07:05 crc kubenswrapper[4948]: E1204 18:07:05.916724 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:07:20 crc kubenswrapper[4948]: I1204 18:07:20.914435 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:07:20 crc kubenswrapper[4948]: E1204 18:07:20.915628 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" 
podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:07:33 crc kubenswrapper[4948]: I1204 18:07:33.913872 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:07:33 crc kubenswrapper[4948]: E1204 18:07:33.914735 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:07:44 crc kubenswrapper[4948]: I1204 18:07:44.914421 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:07:44 crc kubenswrapper[4948]: E1204 18:07:44.915304 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:07:56 crc kubenswrapper[4948]: I1204 18:07:56.913445 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:07:56 crc kubenswrapper[4948]: E1204 18:07:56.914181 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:08:10 crc kubenswrapper[4948]: I1204 18:08:10.914290 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:08:10 crc kubenswrapper[4948]: E1204 18:08:10.915183 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:08:24 crc kubenswrapper[4948]: I1204 18:08:24.914450 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:08:24 crc kubenswrapper[4948]: E1204 18:08:24.915451 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:08:39 crc kubenswrapper[4948]: I1204 18:08:39.914391 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:08:39 crc kubenswrapper[4948]: E1204 18:08:39.915432 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:08:51 crc kubenswrapper[4948]: I1204 18:08:51.914012 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:08:51 crc kubenswrapper[4948]: E1204 18:08:51.914989 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:09:05 crc kubenswrapper[4948]: I1204 18:09:05.914601 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:09:05 crc kubenswrapper[4948]: E1204 18:09:05.915659 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:09:16 crc kubenswrapper[4948]: I1204 18:09:16.914196 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:09:17 crc kubenswrapper[4948]: I1204 18:09:17.772623 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" 
event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"0710806e7d386edb00c5aea1af03d3c98e8a6c744df79b096e254861ac447767"} Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.975683 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hx8dp"] Dec 04 18:09:34 crc kubenswrapper[4948]: E1204 18:09:34.976805 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="extract-utilities" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.976823 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="extract-utilities" Dec 04 18:09:34 crc kubenswrapper[4948]: E1204 18:09:34.976849 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="registry-server" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.976857 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="registry-server" Dec 04 18:09:34 crc kubenswrapper[4948]: E1204 18:09:34.976870 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="extract-utilities" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.976877 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="extract-utilities" Dec 04 18:09:34 crc kubenswrapper[4948]: E1204 18:09:34.976885 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="extract-content" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.976892 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="extract-content" Dec 04 18:09:34 crc kubenswrapper[4948]: E1204 18:09:34.976910 4948 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="registry-server" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.976916 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="registry-server" Dec 04 18:09:34 crc kubenswrapper[4948]: E1204 18:09:34.976929 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="extract-content" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.976935 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="extract-content" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.977134 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="15821477-3678-486e-ad56-cd285b05f80f" containerName="registry-server" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.977153 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="d901db91-33bd-4cab-b6e1-aa5f341d7446" containerName="registry-server" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.978519 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.990855 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-utilities\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.990915 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqbll\" (UniqueName: \"kubernetes.io/projected/c2e13d56-873a-459e-a90b-d787fc899a4c-kube-api-access-dqbll\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.990974 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-catalog-content\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:34 crc kubenswrapper[4948]: I1204 18:09:34.992965 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hx8dp"] Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.091854 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-utilities\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.091899 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dqbll\" (UniqueName: \"kubernetes.io/projected/c2e13d56-873a-459e-a90b-d787fc899a4c-kube-api-access-dqbll\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.091931 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-catalog-content\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.092381 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-utilities\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.092433 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-catalog-content\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.117246 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqbll\" (UniqueName: \"kubernetes.io/projected/c2e13d56-873a-459e-a90b-d787fc899a4c-kube-api-access-dqbll\") pod \"certified-operators-hx8dp\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.299832 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.853517 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hx8dp"] Dec 04 18:09:35 crc kubenswrapper[4948]: W1204 18:09:35.857312 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e13d56_873a_459e_a90b_d787fc899a4c.slice/crio-c744337b5c74d0cc54b740a833a1a569417ef1112706f9f681bf6cd2f8f82be2 WatchSource:0}: Error finding container c744337b5c74d0cc54b740a833a1a569417ef1112706f9f681bf6cd2f8f82be2: Status 404 returned error can't find the container with id c744337b5c74d0cc54b740a833a1a569417ef1112706f9f681bf6cd2f8f82be2 Dec 04 18:09:35 crc kubenswrapper[4948]: I1204 18:09:35.923775 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8dp" event={"ID":"c2e13d56-873a-459e-a90b-d787fc899a4c","Type":"ContainerStarted","Data":"c744337b5c74d0cc54b740a833a1a569417ef1112706f9f681bf6cd2f8f82be2"} Dec 04 18:09:36 crc kubenswrapper[4948]: I1204 18:09:36.932576 4948 generic.go:334] "Generic (PLEG): container finished" podID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerID="09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48" exitCode=0 Dec 04 18:09:36 crc kubenswrapper[4948]: I1204 18:09:36.932657 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8dp" event={"ID":"c2e13d56-873a-459e-a90b-d787fc899a4c","Type":"ContainerDied","Data":"09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48"} Dec 04 18:09:36 crc kubenswrapper[4948]: I1204 18:09:36.934608 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 18:09:38 crc kubenswrapper[4948]: I1204 18:09:38.956728 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerID="918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e" exitCode=0 Dec 04 18:09:38 crc kubenswrapper[4948]: I1204 18:09:38.956844 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8dp" event={"ID":"c2e13d56-873a-459e-a90b-d787fc899a4c","Type":"ContainerDied","Data":"918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e"} Dec 04 18:09:39 crc kubenswrapper[4948]: I1204 18:09:39.968427 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8dp" event={"ID":"c2e13d56-873a-459e-a90b-d787fc899a4c","Type":"ContainerStarted","Data":"5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5"} Dec 04 18:09:39 crc kubenswrapper[4948]: I1204 18:09:39.986791 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hx8dp" podStartSLOduration=3.564111001 podStartE2EDuration="5.986775501s" podCreationTimestamp="2025-12-04 18:09:34 +0000 UTC" firstStartedPulling="2025-12-04 18:09:36.934288801 +0000 UTC m=+2588.295363203" lastFinishedPulling="2025-12-04 18:09:39.356953301 +0000 UTC m=+2590.718027703" observedRunningTime="2025-12-04 18:09:39.984693131 +0000 UTC m=+2591.345767543" watchObservedRunningTime="2025-12-04 18:09:39.986775501 +0000 UTC m=+2591.347849903" Dec 04 18:09:45 crc kubenswrapper[4948]: I1204 18:09:45.300303 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:45 crc kubenswrapper[4948]: I1204 18:09:45.300821 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:45 crc kubenswrapper[4948]: I1204 18:09:45.347363 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 
18:09:46 crc kubenswrapper[4948]: I1204 18:09:46.051051 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:46 crc kubenswrapper[4948]: I1204 18:09:46.098381 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hx8dp"] Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.025252 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hx8dp" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="registry-server" containerID="cri-o://5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5" gracePeriod=2 Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.402089 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.504479 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-utilities\") pod \"c2e13d56-873a-459e-a90b-d787fc899a4c\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.504545 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqbll\" (UniqueName: \"kubernetes.io/projected/c2e13d56-873a-459e-a90b-d787fc899a4c-kube-api-access-dqbll\") pod \"c2e13d56-873a-459e-a90b-d787fc899a4c\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.504772 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-catalog-content\") pod \"c2e13d56-873a-459e-a90b-d787fc899a4c\" (UID: \"c2e13d56-873a-459e-a90b-d787fc899a4c\") " Dec 
04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.505497 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-utilities" (OuterVolumeSpecName: "utilities") pod "c2e13d56-873a-459e-a90b-d787fc899a4c" (UID: "c2e13d56-873a-459e-a90b-d787fc899a4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.512035 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e13d56-873a-459e-a90b-d787fc899a4c-kube-api-access-dqbll" (OuterVolumeSpecName: "kube-api-access-dqbll") pod "c2e13d56-873a-459e-a90b-d787fc899a4c" (UID: "c2e13d56-873a-459e-a90b-d787fc899a4c"). InnerVolumeSpecName "kube-api-access-dqbll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.606082 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.606122 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqbll\" (UniqueName: \"kubernetes.io/projected/c2e13d56-873a-459e-a90b-d787fc899a4c-kube-api-access-dqbll\") on node \"crc\" DevicePath \"\"" Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.646838 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2e13d56-873a-459e-a90b-d787fc899a4c" (UID: "c2e13d56-873a-459e-a90b-d787fc899a4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:09:48 crc kubenswrapper[4948]: I1204 18:09:48.707255 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2e13d56-873a-459e-a90b-d787fc899a4c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.034974 4948 generic.go:334] "Generic (PLEG): container finished" podID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerID="5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5" exitCode=0 Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.035064 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hx8dp" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.035088 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8dp" event={"ID":"c2e13d56-873a-459e-a90b-d787fc899a4c","Type":"ContainerDied","Data":"5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5"} Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.036459 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx8dp" event={"ID":"c2e13d56-873a-459e-a90b-d787fc899a4c","Type":"ContainerDied","Data":"c744337b5c74d0cc54b740a833a1a569417ef1112706f9f681bf6cd2f8f82be2"} Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.036507 4948 scope.go:117] "RemoveContainer" containerID="5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.068966 4948 scope.go:117] "RemoveContainer" containerID="918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.074275 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hx8dp"] Dec 04 18:09:49 crc kubenswrapper[4948]: 
I1204 18:09:49.091271 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hx8dp"] Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.102972 4948 scope.go:117] "RemoveContainer" containerID="09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.119225 4948 scope.go:117] "RemoveContainer" containerID="5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5" Dec 04 18:09:49 crc kubenswrapper[4948]: E1204 18:09:49.119636 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5\": container with ID starting with 5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5 not found: ID does not exist" containerID="5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.119723 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5"} err="failed to get container status \"5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5\": rpc error: code = NotFound desc = could not find container \"5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5\": container with ID starting with 5bb81fd170efc3d240310744397c2315e5343324540cc26ef247d2d83f2f70b5 not found: ID does not exist" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.119812 4948 scope.go:117] "RemoveContainer" containerID="918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e" Dec 04 18:09:49 crc kubenswrapper[4948]: E1204 18:09:49.120236 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e\": container 
with ID starting with 918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e not found: ID does not exist" containerID="918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.120264 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e"} err="failed to get container status \"918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e\": rpc error: code = NotFound desc = could not find container \"918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e\": container with ID starting with 918a8d604106f156f7cb5105ff1fa6fab1d84a8c6e8560725108056d61968f4e not found: ID does not exist" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.120284 4948 scope.go:117] "RemoveContainer" containerID="09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48" Dec 04 18:09:49 crc kubenswrapper[4948]: E1204 18:09:49.120695 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48\": container with ID starting with 09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48 not found: ID does not exist" containerID="09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48" Dec 04 18:09:49 crc kubenswrapper[4948]: I1204 18:09:49.120731 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48"} err="failed to get container status \"09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48\": rpc error: code = NotFound desc = could not find container \"09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48\": container with ID starting with 09a6cb0d8e2b492becc0f8547f5cf026b0e047a6b6c57d4e30b70a97b1e30f48 not 
found: ID does not exist" Dec 04 18:09:50 crc kubenswrapper[4948]: I1204 18:09:50.922969 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" path="/var/lib/kubelet/pods/c2e13d56-873a-459e-a90b-d787fc899a4c/volumes" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.408975 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nmhz7"] Dec 04 18:10:52 crc kubenswrapper[4948]: E1204 18:10:52.409935 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="extract-utilities" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.409956 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="extract-utilities" Dec 04 18:10:52 crc kubenswrapper[4948]: E1204 18:10:52.409999 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="registry-server" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.410010 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="registry-server" Dec 04 18:10:52 crc kubenswrapper[4948]: E1204 18:10:52.410032 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="extract-content" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.410065 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="extract-content" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.410277 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e13d56-873a-459e-a90b-d787fc899a4c" containerName="registry-server" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.412072 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.446859 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmhz7"] Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.547323 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-catalog-content\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.547595 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/cc0f9761-7938-4ab0-bbce-2f734122f3e1-kube-api-access-n2pkt\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.547693 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-utilities\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.648862 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-catalog-content\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.648938 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/cc0f9761-7938-4ab0-bbce-2f734122f3e1-kube-api-access-n2pkt\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.648981 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-utilities\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.649651 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-utilities\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.649740 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-catalog-content\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.686157 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/cc0f9761-7938-4ab0-bbce-2f734122f3e1-kube-api-access-n2pkt\") pod \"redhat-marketplace-nmhz7\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:52 crc kubenswrapper[4948]: I1204 18:10:52.738333 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:10:53 crc kubenswrapper[4948]: I1204 18:10:53.189980 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmhz7"] Dec 04 18:10:53 crc kubenswrapper[4948]: I1204 18:10:53.610302 4948 generic.go:334] "Generic (PLEG): container finished" podID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerID="445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6" exitCode=0 Dec 04 18:10:53 crc kubenswrapper[4948]: I1204 18:10:53.610357 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmhz7" event={"ID":"cc0f9761-7938-4ab0-bbce-2f734122f3e1","Type":"ContainerDied","Data":"445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6"} Dec 04 18:10:53 crc kubenswrapper[4948]: I1204 18:10:53.610766 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmhz7" event={"ID":"cc0f9761-7938-4ab0-bbce-2f734122f3e1","Type":"ContainerStarted","Data":"7904488e37293b6c4573cce107d0ee2fb9b7429eb8e785e0d69557bf1935f78b"} Dec 04 18:10:55 crc kubenswrapper[4948]: I1204 18:10:55.631620 4948 generic.go:334] "Generic (PLEG): container finished" podID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerID="17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f" exitCode=0 Dec 04 18:10:55 crc kubenswrapper[4948]: I1204 18:10:55.631770 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmhz7" event={"ID":"cc0f9761-7938-4ab0-bbce-2f734122f3e1","Type":"ContainerDied","Data":"17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f"} Dec 04 18:10:56 crc kubenswrapper[4948]: I1204 18:10:56.642503 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmhz7" 
event={"ID":"cc0f9761-7938-4ab0-bbce-2f734122f3e1","Type":"ContainerStarted","Data":"2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067"} Dec 04 18:10:56 crc kubenswrapper[4948]: I1204 18:10:56.668498 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nmhz7" podStartSLOduration=2.263242535 podStartE2EDuration="4.668481587s" podCreationTimestamp="2025-12-04 18:10:52 +0000 UTC" firstStartedPulling="2025-12-04 18:10:53.611879731 +0000 UTC m=+2664.972954143" lastFinishedPulling="2025-12-04 18:10:56.017118773 +0000 UTC m=+2667.378193195" observedRunningTime="2025-12-04 18:10:56.663928466 +0000 UTC m=+2668.025002868" watchObservedRunningTime="2025-12-04 18:10:56.668481587 +0000 UTC m=+2668.029555989" Dec 04 18:11:02 crc kubenswrapper[4948]: I1204 18:11:02.738648 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:11:02 crc kubenswrapper[4948]: I1204 18:11:02.739111 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:11:02 crc kubenswrapper[4948]: I1204 18:11:02.794854 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:11:03 crc kubenswrapper[4948]: I1204 18:11:03.793490 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:11:03 crc kubenswrapper[4948]: I1204 18:11:03.851673 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmhz7"] Dec 04 18:11:05 crc kubenswrapper[4948]: I1204 18:11:05.729609 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nmhz7" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="registry-server" 
containerID="cri-o://2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067" gracePeriod=2 Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.735887 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.747674 4948 generic.go:334] "Generic (PLEG): container finished" podID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerID="2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067" exitCode=0 Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.747747 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmhz7" event={"ID":"cc0f9761-7938-4ab0-bbce-2f734122f3e1","Type":"ContainerDied","Data":"2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067"} Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.747800 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nmhz7" event={"ID":"cc0f9761-7938-4ab0-bbce-2f734122f3e1","Type":"ContainerDied","Data":"7904488e37293b6c4573cce107d0ee2fb9b7429eb8e785e0d69557bf1935f78b"} Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.747841 4948 scope.go:117] "RemoveContainer" containerID="2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.748096 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nmhz7" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.785518 4948 scope.go:117] "RemoveContainer" containerID="17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.801921 4948 scope.go:117] "RemoveContainer" containerID="445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.832819 4948 scope.go:117] "RemoveContainer" containerID="2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067" Dec 04 18:11:06 crc kubenswrapper[4948]: E1204 18:11:06.833170 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067\": container with ID starting with 2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067 not found: ID does not exist" containerID="2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.833206 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067"} err="failed to get container status \"2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067\": rpc error: code = NotFound desc = could not find container \"2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067\": container with ID starting with 2e27f378bd8f8a6b3225ff56b66daae8a4b89aef3d30686133105af705dc3067 not found: ID does not exist" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.833229 4948 scope.go:117] "RemoveContainer" containerID="17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f" Dec 04 18:11:06 crc kubenswrapper[4948]: E1204 18:11:06.833502 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f\": container with ID starting with 17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f not found: ID does not exist" containerID="17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.833526 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f"} err="failed to get container status \"17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f\": rpc error: code = NotFound desc = could not find container \"17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f\": container with ID starting with 17df898f7e9a818ddc3a77469c8ea791b715e6017c0688718dacedaa8453e44f not found: ID does not exist" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.833542 4948 scope.go:117] "RemoveContainer" containerID="445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6" Dec 04 18:11:06 crc kubenswrapper[4948]: E1204 18:11:06.833815 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6\": container with ID starting with 445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6 not found: ID does not exist" containerID="445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.833834 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6"} err="failed to get container status \"445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6\": rpc error: code = NotFound desc = could not find container 
\"445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6\": container with ID starting with 445c8aad96d95e5b265c89df8c34a8a33f0a2f30fd87c000ae3a66d9938a6cd6 not found: ID does not exist" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.874298 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-catalog-content\") pod \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.874556 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/cc0f9761-7938-4ab0-bbce-2f734122f3e1-kube-api-access-n2pkt\") pod \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.874691 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-utilities\") pod \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\" (UID: \"cc0f9761-7938-4ab0-bbce-2f734122f3e1\") " Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.876275 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-utilities" (OuterVolumeSpecName: "utilities") pod "cc0f9761-7938-4ab0-bbce-2f734122f3e1" (UID: "cc0f9761-7938-4ab0-bbce-2f734122f3e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.883250 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc0f9761-7938-4ab0-bbce-2f734122f3e1-kube-api-access-n2pkt" (OuterVolumeSpecName: "kube-api-access-n2pkt") pod "cc0f9761-7938-4ab0-bbce-2f734122f3e1" (UID: "cc0f9761-7938-4ab0-bbce-2f734122f3e1"). InnerVolumeSpecName "kube-api-access-n2pkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.892711 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc0f9761-7938-4ab0-bbce-2f734122f3e1" (UID: "cc0f9761-7938-4ab0-bbce-2f734122f3e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.977538 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.977593 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2pkt\" (UniqueName: \"kubernetes.io/projected/cc0f9761-7938-4ab0-bbce-2f734122f3e1-kube-api-access-n2pkt\") on node \"crc\" DevicePath \"\"" Dec 04 18:11:06 crc kubenswrapper[4948]: I1204 18:11:06.977611 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc0f9761-7938-4ab0-bbce-2f734122f3e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:11:07 crc kubenswrapper[4948]: I1204 18:11:07.078816 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmhz7"] Dec 04 18:11:07 crc kubenswrapper[4948]: I1204 
18:11:07.087154 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nmhz7"] Dec 04 18:11:08 crc kubenswrapper[4948]: I1204 18:11:08.930256 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" path="/var/lib/kubelet/pods/cc0f9761-7938-4ab0-bbce-2f734122f3e1/volumes" Dec 04 18:11:40 crc kubenswrapper[4948]: I1204 18:11:40.625414 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:11:40 crc kubenswrapper[4948]: I1204 18:11:40.627017 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:12:10 crc kubenswrapper[4948]: I1204 18:12:10.624670 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:12:10 crc kubenswrapper[4948]: I1204 18:12:10.625276 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:12:40 crc kubenswrapper[4948]: I1204 18:12:40.625301 4948 patch_prober.go:28] interesting 
pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:12:40 crc kubenswrapper[4948]: I1204 18:12:40.625750 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:12:40 crc kubenswrapper[4948]: I1204 18:12:40.625808 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:12:40 crc kubenswrapper[4948]: I1204 18:12:40.626537 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0710806e7d386edb00c5aea1af03d3c98e8a6c744df79b096e254861ac447767"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:12:40 crc kubenswrapper[4948]: I1204 18:12:40.626605 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://0710806e7d386edb00c5aea1af03d3c98e8a6c744df79b096e254861ac447767" gracePeriod=600 Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.198680 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4xqc"] Dec 04 18:12:41 crc kubenswrapper[4948]: E1204 18:12:41.200096 4948 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="extract-utilities" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.200260 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="extract-utilities" Dec 04 18:12:41 crc kubenswrapper[4948]: E1204 18:12:41.200399 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="registry-server" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.200537 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="registry-server" Dec 04 18:12:41 crc kubenswrapper[4948]: E1204 18:12:41.200694 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="extract-content" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.200939 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="extract-content" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.201402 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0f9761-7938-4ab0-bbce-2f734122f3e1" containerName="registry-server" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.203382 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.210754 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4xqc"] Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.340460 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-utilities\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.340542 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-catalog-content\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.340599 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndljx\" (UniqueName: \"kubernetes.io/projected/fd54332a-315f-40e9-9809-70a19ebc80df-kube-api-access-ndljx\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.442351 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-catalog-content\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.442438 4948 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ndljx\" (UniqueName: \"kubernetes.io/projected/fd54332a-315f-40e9-9809-70a19ebc80df-kube-api-access-ndljx\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.442501 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-utilities\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.442979 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-catalog-content\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.442994 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-utilities\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.466125 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndljx\" (UniqueName: \"kubernetes.io/projected/fd54332a-315f-40e9-9809-70a19ebc80df-kube-api-access-ndljx\") pod \"redhat-operators-h4xqc\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.537131 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.540639 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="0710806e7d386edb00c5aea1af03d3c98e8a6c744df79b096e254861ac447767" exitCode=0 Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.540672 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"0710806e7d386edb00c5aea1af03d3c98e8a6c744df79b096e254861ac447767"} Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.540697 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b"} Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.540712 4948 scope.go:117] "RemoveContainer" containerID="b00f31ece06c320365934fcff9acf3b06f61fd063bd7d3a1a9b48c80c71323c7" Dec 04 18:12:41 crc kubenswrapper[4948]: I1204 18:12:41.963582 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4xqc"] Dec 04 18:12:42 crc kubenswrapper[4948]: I1204 18:12:42.556232 4948 generic.go:334] "Generic (PLEG): container finished" podID="fd54332a-315f-40e9-9809-70a19ebc80df" containerID="3d17588d8652ca10ceda363b706079040deb4af9732c2b0511f1cfea48fdc262" exitCode=0 Dec 04 18:12:42 crc kubenswrapper[4948]: I1204 18:12:42.556320 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerDied","Data":"3d17588d8652ca10ceda363b706079040deb4af9732c2b0511f1cfea48fdc262"} Dec 04 18:12:42 crc kubenswrapper[4948]: I1204 18:12:42.556495 4948 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerStarted","Data":"cdec0522e25ce00a5819429dab5e31954962e7f38d28438a2ee97bcf65c0f418"} Dec 04 18:12:43 crc kubenswrapper[4948]: I1204 18:12:43.567747 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerStarted","Data":"6ae432e6f5d44702a3905714c008e48af810a76ea3dccfac8743edbc922bdece"} Dec 04 18:12:44 crc kubenswrapper[4948]: I1204 18:12:44.578116 4948 generic.go:334] "Generic (PLEG): container finished" podID="fd54332a-315f-40e9-9809-70a19ebc80df" containerID="6ae432e6f5d44702a3905714c008e48af810a76ea3dccfac8743edbc922bdece" exitCode=0 Dec 04 18:12:44 crc kubenswrapper[4948]: I1204 18:12:44.578193 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerDied","Data":"6ae432e6f5d44702a3905714c008e48af810a76ea3dccfac8743edbc922bdece"} Dec 04 18:12:45 crc kubenswrapper[4948]: I1204 18:12:45.588359 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerStarted","Data":"05bedf7394f2c43fb369ffb3fa4c7e76c67ed7cbd7f4e205e8909bb633bed1e8"} Dec 04 18:12:45 crc kubenswrapper[4948]: I1204 18:12:45.610280 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4xqc" podStartSLOduration=2.191885864 podStartE2EDuration="4.610253601s" podCreationTimestamp="2025-12-04 18:12:41 +0000 UTC" firstStartedPulling="2025-12-04 18:12:42.559148245 +0000 UTC m=+2773.920222657" lastFinishedPulling="2025-12-04 18:12:44.977515952 +0000 UTC m=+2776.338590394" observedRunningTime="2025-12-04 18:12:45.606482833 +0000 UTC 
m=+2776.967557285" watchObservedRunningTime="2025-12-04 18:12:45.610253601 +0000 UTC m=+2776.971328013" Dec 04 18:12:51 crc kubenswrapper[4948]: I1204 18:12:51.538208 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:51 crc kubenswrapper[4948]: I1204 18:12:51.538972 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:51 crc kubenswrapper[4948]: I1204 18:12:51.618473 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:51 crc kubenswrapper[4948]: I1204 18:12:51.709298 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:51 crc kubenswrapper[4948]: I1204 18:12:51.869975 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4xqc"] Dec 04 18:12:53 crc kubenswrapper[4948]: I1204 18:12:53.657887 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4xqc" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="registry-server" containerID="cri-o://05bedf7394f2c43fb369ffb3fa4c7e76c67ed7cbd7f4e205e8909bb633bed1e8" gracePeriod=2 Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.686243 4948 generic.go:334] "Generic (PLEG): container finished" podID="fd54332a-315f-40e9-9809-70a19ebc80df" containerID="05bedf7394f2c43fb369ffb3fa4c7e76c67ed7cbd7f4e205e8909bb633bed1e8" exitCode=0 Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.686360 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerDied","Data":"05bedf7394f2c43fb369ffb3fa4c7e76c67ed7cbd7f4e205e8909bb633bed1e8"} Dec 04 18:12:55 crc 
kubenswrapper[4948]: I1204 18:12:55.910697 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.987322 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-catalog-content\") pod \"fd54332a-315f-40e9-9809-70a19ebc80df\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.987430 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndljx\" (UniqueName: \"kubernetes.io/projected/fd54332a-315f-40e9-9809-70a19ebc80df-kube-api-access-ndljx\") pod \"fd54332a-315f-40e9-9809-70a19ebc80df\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.987464 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-utilities\") pod \"fd54332a-315f-40e9-9809-70a19ebc80df\" (UID: \"fd54332a-315f-40e9-9809-70a19ebc80df\") " Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.988503 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-utilities" (OuterVolumeSpecName: "utilities") pod "fd54332a-315f-40e9-9809-70a19ebc80df" (UID: "fd54332a-315f-40e9-9809-70a19ebc80df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:12:55 crc kubenswrapper[4948]: I1204 18:12:55.996521 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd54332a-315f-40e9-9809-70a19ebc80df-kube-api-access-ndljx" (OuterVolumeSpecName: "kube-api-access-ndljx") pod "fd54332a-315f-40e9-9809-70a19ebc80df" (UID: "fd54332a-315f-40e9-9809-70a19ebc80df"). InnerVolumeSpecName "kube-api-access-ndljx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.088811 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndljx\" (UniqueName: \"kubernetes.io/projected/fd54332a-315f-40e9-9809-70a19ebc80df-kube-api-access-ndljx\") on node \"crc\" DevicePath \"\"" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.088851 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.139610 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd54332a-315f-40e9-9809-70a19ebc80df" (UID: "fd54332a-315f-40e9-9809-70a19ebc80df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.189922 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd54332a-315f-40e9-9809-70a19ebc80df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.696926 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4xqc" event={"ID":"fd54332a-315f-40e9-9809-70a19ebc80df","Type":"ContainerDied","Data":"cdec0522e25ce00a5819429dab5e31954962e7f38d28438a2ee97bcf65c0f418"} Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.697284 4948 scope.go:117] "RemoveContainer" containerID="05bedf7394f2c43fb369ffb3fa4c7e76c67ed7cbd7f4e205e8909bb633bed1e8" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.697094 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4xqc" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.730413 4948 scope.go:117] "RemoveContainer" containerID="6ae432e6f5d44702a3905714c008e48af810a76ea3dccfac8743edbc922bdece" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.753817 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4xqc"] Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.762743 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4xqc"] Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.777491 4948 scope.go:117] "RemoveContainer" containerID="3d17588d8652ca10ceda363b706079040deb4af9732c2b0511f1cfea48fdc262" Dec 04 18:12:56 crc kubenswrapper[4948]: I1204 18:12:56.928156 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" path="/var/lib/kubelet/pods/fd54332a-315f-40e9-9809-70a19ebc80df/volumes" Dec 04 18:13:34 crc 
kubenswrapper[4948]: I1204 18:13:34.759518 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wc6wc"] Dec 04 18:13:34 crc kubenswrapper[4948]: E1204 18:13:34.760383 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="registry-server" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.760399 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="registry-server" Dec 04 18:13:34 crc kubenswrapper[4948]: E1204 18:13:34.760419 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="extract-utilities" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.760428 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="extract-utilities" Dec 04 18:13:34 crc kubenswrapper[4948]: E1204 18:13:34.760439 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="extract-content" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.760446 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="extract-content" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.760633 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd54332a-315f-40e9-9809-70a19ebc80df" containerName="registry-server" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.761839 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.777365 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc6wc"] Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.898164 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9nq\" (UniqueName: \"kubernetes.io/projected/36707fc1-0d42-407d-950b-ee7f42b4dfdc-kube-api-access-ls9nq\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.898285 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-utilities\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.898329 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-catalog-content\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.999390 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9nq\" (UniqueName: \"kubernetes.io/projected/36707fc1-0d42-407d-950b-ee7f42b4dfdc-kube-api-access-ls9nq\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.999450 4948 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-utilities\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.999477 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-catalog-content\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:34 crc kubenswrapper[4948]: I1204 18:13:34.999971 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-utilities\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:35 crc kubenswrapper[4948]: I1204 18:13:35.000198 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-catalog-content\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:35 crc kubenswrapper[4948]: I1204 18:13:35.021627 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls9nq\" (UniqueName: \"kubernetes.io/projected/36707fc1-0d42-407d-950b-ee7f42b4dfdc-kube-api-access-ls9nq\") pod \"community-operators-wc6wc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:35 crc kubenswrapper[4948]: I1204 18:13:35.083283 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:35 crc kubenswrapper[4948]: I1204 18:13:35.596898 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wc6wc"] Dec 04 18:13:35 crc kubenswrapper[4948]: W1204 18:13:35.606651 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36707fc1_0d42_407d_950b_ee7f42b4dfdc.slice/crio-04666d7def95f64f200dc4d9aae8c3fe4bce33985a02d30a42f2ffbf709d40b7 WatchSource:0}: Error finding container 04666d7def95f64f200dc4d9aae8c3fe4bce33985a02d30a42f2ffbf709d40b7: Status 404 returned error can't find the container with id 04666d7def95f64f200dc4d9aae8c3fe4bce33985a02d30a42f2ffbf709d40b7 Dec 04 18:13:36 crc kubenswrapper[4948]: I1204 18:13:36.081714 4948 generic.go:334] "Generic (PLEG): container finished" podID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerID="ea994cd91f9baa90e3ecd42d4f0ebf605770803202db56a978ef1ee757bcb887" exitCode=0 Dec 04 18:13:36 crc kubenswrapper[4948]: I1204 18:13:36.081849 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc6wc" event={"ID":"36707fc1-0d42-407d-950b-ee7f42b4dfdc","Type":"ContainerDied","Data":"ea994cd91f9baa90e3ecd42d4f0ebf605770803202db56a978ef1ee757bcb887"} Dec 04 18:13:36 crc kubenswrapper[4948]: I1204 18:13:36.082196 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc6wc" event={"ID":"36707fc1-0d42-407d-950b-ee7f42b4dfdc","Type":"ContainerStarted","Data":"04666d7def95f64f200dc4d9aae8c3fe4bce33985a02d30a42f2ffbf709d40b7"} Dec 04 18:13:37 crc kubenswrapper[4948]: I1204 18:13:37.094778 4948 generic.go:334] "Generic (PLEG): container finished" podID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerID="62d0dc724a2ba9607c7ce9a29d8e6db18c09cde9fb954a40a648a39dbee3346f" exitCode=0 Dec 04 18:13:37 crc kubenswrapper[4948]: I1204 
18:13:37.094828 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc6wc" event={"ID":"36707fc1-0d42-407d-950b-ee7f42b4dfdc","Type":"ContainerDied","Data":"62d0dc724a2ba9607c7ce9a29d8e6db18c09cde9fb954a40a648a39dbee3346f"} Dec 04 18:13:38 crc kubenswrapper[4948]: I1204 18:13:38.106864 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc6wc" event={"ID":"36707fc1-0d42-407d-950b-ee7f42b4dfdc","Type":"ContainerStarted","Data":"c69e95693f503f152c10fe0c0e9e608cfa189a93eb19dd7a0b5f2626cf89cd72"} Dec 04 18:13:38 crc kubenswrapper[4948]: I1204 18:13:38.133977 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wc6wc" podStartSLOduration=2.724935897 podStartE2EDuration="4.133956532s" podCreationTimestamp="2025-12-04 18:13:34 +0000 UTC" firstStartedPulling="2025-12-04 18:13:36.084899241 +0000 UTC m=+2827.445973673" lastFinishedPulling="2025-12-04 18:13:37.493919906 +0000 UTC m=+2828.854994308" observedRunningTime="2025-12-04 18:13:38.124153814 +0000 UTC m=+2829.485228236" watchObservedRunningTime="2025-12-04 18:13:38.133956532 +0000 UTC m=+2829.495030944" Dec 04 18:13:45 crc kubenswrapper[4948]: I1204 18:13:45.083501 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:45 crc kubenswrapper[4948]: I1204 18:13:45.084491 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:45 crc kubenswrapper[4948]: I1204 18:13:45.149451 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:45 crc kubenswrapper[4948]: I1204 18:13:45.213036 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wc6wc" Dec 
04 18:13:45 crc kubenswrapper[4948]: I1204 18:13:45.396643 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc6wc"] Dec 04 18:13:47 crc kubenswrapper[4948]: I1204 18:13:47.183588 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wc6wc" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="registry-server" containerID="cri-o://c69e95693f503f152c10fe0c0e9e608cfa189a93eb19dd7a0b5f2626cf89cd72" gracePeriod=2 Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.194746 4948 generic.go:334] "Generic (PLEG): container finished" podID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerID="c69e95693f503f152c10fe0c0e9e608cfa189a93eb19dd7a0b5f2626cf89cd72" exitCode=0 Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.194822 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc6wc" event={"ID":"36707fc1-0d42-407d-950b-ee7f42b4dfdc","Type":"ContainerDied","Data":"c69e95693f503f152c10fe0c0e9e608cfa189a93eb19dd7a0b5f2626cf89cd72"} Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.195248 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wc6wc" event={"ID":"36707fc1-0d42-407d-950b-ee7f42b4dfdc","Type":"ContainerDied","Data":"04666d7def95f64f200dc4d9aae8c3fe4bce33985a02d30a42f2ffbf709d40b7"} Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.195265 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04666d7def95f64f200dc4d9aae8c3fe4bce33985a02d30a42f2ffbf709d40b7" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.198747 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.225679 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls9nq\" (UniqueName: \"kubernetes.io/projected/36707fc1-0d42-407d-950b-ee7f42b4dfdc-kube-api-access-ls9nq\") pod \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.225756 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-utilities\") pod \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.225865 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-catalog-content\") pod \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\" (UID: \"36707fc1-0d42-407d-950b-ee7f42b4dfdc\") " Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.226738 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-utilities" (OuterVolumeSpecName: "utilities") pod "36707fc1-0d42-407d-950b-ee7f42b4dfdc" (UID: "36707fc1-0d42-407d-950b-ee7f42b4dfdc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.232277 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36707fc1-0d42-407d-950b-ee7f42b4dfdc-kube-api-access-ls9nq" (OuterVolumeSpecName: "kube-api-access-ls9nq") pod "36707fc1-0d42-407d-950b-ee7f42b4dfdc" (UID: "36707fc1-0d42-407d-950b-ee7f42b4dfdc"). InnerVolumeSpecName "kube-api-access-ls9nq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.294284 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36707fc1-0d42-407d-950b-ee7f42b4dfdc" (UID: "36707fc1-0d42-407d-950b-ee7f42b4dfdc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.327708 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls9nq\" (UniqueName: \"kubernetes.io/projected/36707fc1-0d42-407d-950b-ee7f42b4dfdc-kube-api-access-ls9nq\") on node \"crc\" DevicePath \"\"" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.327770 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:13:48 crc kubenswrapper[4948]: I1204 18:13:48.327798 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36707fc1-0d42-407d-950b-ee7f42b4dfdc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:13:49 crc kubenswrapper[4948]: I1204 18:13:49.204526 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wc6wc" Dec 04 18:13:49 crc kubenswrapper[4948]: I1204 18:13:49.235209 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wc6wc"] Dec 04 18:13:49 crc kubenswrapper[4948]: I1204 18:13:49.243346 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wc6wc"] Dec 04 18:13:50 crc kubenswrapper[4948]: I1204 18:13:50.931427 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" path="/var/lib/kubelet/pods/36707fc1-0d42-407d-950b-ee7f42b4dfdc/volumes" Dec 04 18:14:40 crc kubenswrapper[4948]: I1204 18:14:40.624460 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:14:40 crc kubenswrapper[4948]: I1204 18:14:40.625433 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.147867 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww"] Dec 04 18:15:00 crc kubenswrapper[4948]: E1204 18:15:00.149995 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="extract-content" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.150138 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" 
containerName="extract-content" Dec 04 18:15:00 crc kubenswrapper[4948]: E1204 18:15:00.150274 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="registry-server" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.150364 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="registry-server" Dec 04 18:15:00 crc kubenswrapper[4948]: E1204 18:15:00.150464 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="extract-utilities" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.150545 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="extract-utilities" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.150823 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="36707fc1-0d42-407d-950b-ee7f42b4dfdc" containerName="registry-server" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.151499 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.153489 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.153546 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.158179 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww"] Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.233471 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0039de0-35fb-4079-81a8-d502073646f7-config-volume\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.233580 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0039de0-35fb-4079-81a8-d502073646f7-secret-volume\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.233696 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77vt9\" (UniqueName: \"kubernetes.io/projected/b0039de0-35fb-4079-81a8-d502073646f7-kube-api-access-77vt9\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.334843 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0039de0-35fb-4079-81a8-d502073646f7-secret-volume\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.335449 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77vt9\" (UniqueName: \"kubernetes.io/projected/b0039de0-35fb-4079-81a8-d502073646f7-kube-api-access-77vt9\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.335683 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0039de0-35fb-4079-81a8-d502073646f7-config-volume\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.337641 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0039de0-35fb-4079-81a8-d502073646f7-config-volume\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.340493 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b0039de0-35fb-4079-81a8-d502073646f7-secret-volume\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.358327 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77vt9\" (UniqueName: \"kubernetes.io/projected/b0039de0-35fb-4079-81a8-d502073646f7-kube-api-access-77vt9\") pod \"collect-profiles-29414535-lk7ww\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.500758 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:00 crc kubenswrapper[4948]: I1204 18:15:00.986139 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww"] Dec 04 18:15:01 crc kubenswrapper[4948]: I1204 18:15:01.849618 4948 generic.go:334] "Generic (PLEG): container finished" podID="b0039de0-35fb-4079-81a8-d502073646f7" containerID="8906b592fcd969d924a5bab5f2cc38953829faef436db60819607aa46fc0a746" exitCode=0 Dec 04 18:15:01 crc kubenswrapper[4948]: I1204 18:15:01.849730 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" event={"ID":"b0039de0-35fb-4079-81a8-d502073646f7","Type":"ContainerDied","Data":"8906b592fcd969d924a5bab5f2cc38953829faef436db60819607aa46fc0a746"} Dec 04 18:15:01 crc kubenswrapper[4948]: I1204 18:15:01.850269 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" 
event={"ID":"b0039de0-35fb-4079-81a8-d502073646f7","Type":"ContainerStarted","Data":"653c4a12d51e799bac51136a9f8301151b26c03cd556167a4769778af372d05d"} Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.159933 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.181410 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77vt9\" (UniqueName: \"kubernetes.io/projected/b0039de0-35fb-4079-81a8-d502073646f7-kube-api-access-77vt9\") pod \"b0039de0-35fb-4079-81a8-d502073646f7\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.181554 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0039de0-35fb-4079-81a8-d502073646f7-config-volume\") pod \"b0039de0-35fb-4079-81a8-d502073646f7\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.181603 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0039de0-35fb-4079-81a8-d502073646f7-secret-volume\") pod \"b0039de0-35fb-4079-81a8-d502073646f7\" (UID: \"b0039de0-35fb-4079-81a8-d502073646f7\") " Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.182973 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0039de0-35fb-4079-81a8-d502073646f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "b0039de0-35fb-4079-81a8-d502073646f7" (UID: "b0039de0-35fb-4079-81a8-d502073646f7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.188872 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0039de0-35fb-4079-81a8-d502073646f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b0039de0-35fb-4079-81a8-d502073646f7" (UID: "b0039de0-35fb-4079-81a8-d502073646f7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.194809 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0039de0-35fb-4079-81a8-d502073646f7-kube-api-access-77vt9" (OuterVolumeSpecName: "kube-api-access-77vt9") pod "b0039de0-35fb-4079-81a8-d502073646f7" (UID: "b0039de0-35fb-4079-81a8-d502073646f7"). InnerVolumeSpecName "kube-api-access-77vt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.283347 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77vt9\" (UniqueName: \"kubernetes.io/projected/b0039de0-35fb-4079-81a8-d502073646f7-kube-api-access-77vt9\") on node \"crc\" DevicePath \"\"" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.283400 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b0039de0-35fb-4079-81a8-d502073646f7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.283417 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b0039de0-35fb-4079-81a8-d502073646f7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.868689 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" 
event={"ID":"b0039de0-35fb-4079-81a8-d502073646f7","Type":"ContainerDied","Data":"653c4a12d51e799bac51136a9f8301151b26c03cd556167a4769778af372d05d"} Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.868751 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="653c4a12d51e799bac51136a9f8301151b26c03cd556167a4769778af372d05d" Dec 04 18:15:03 crc kubenswrapper[4948]: I1204 18:15:03.868779 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww" Dec 04 18:15:04 crc kubenswrapper[4948]: I1204 18:15:04.255843 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk"] Dec 04 18:15:04 crc kubenswrapper[4948]: I1204 18:15:04.260517 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414490-fzmqk"] Dec 04 18:15:04 crc kubenswrapper[4948]: I1204 18:15:04.922597 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e652828-b844-4fa8-8e4d-54726614f646" path="/var/lib/kubelet/pods/3e652828-b844-4fa8-8e4d-54726614f646/volumes" Dec 04 18:15:10 crc kubenswrapper[4948]: I1204 18:15:10.624561 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:15:10 crc kubenswrapper[4948]: I1204 18:15:10.624912 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:15:40 crc 
kubenswrapper[4948]: I1204 18:15:40.624485 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:15:40 crc kubenswrapper[4948]: I1204 18:15:40.625228 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:15:40 crc kubenswrapper[4948]: I1204 18:15:40.625281 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:15:40 crc kubenswrapper[4948]: I1204 18:15:40.626114 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:15:40 crc kubenswrapper[4948]: I1204 18:15:40.626175 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" gracePeriod=600 Dec 04 18:15:40 crc kubenswrapper[4948]: E1204 18:15:40.758986 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:15:41 crc kubenswrapper[4948]: I1204 18:15:41.215903 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" exitCode=0 Dec 04 18:15:41 crc kubenswrapper[4948]: I1204 18:15:41.216020 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b"} Dec 04 18:15:41 crc kubenswrapper[4948]: I1204 18:15:41.216121 4948 scope.go:117] "RemoveContainer" containerID="0710806e7d386edb00c5aea1af03d3c98e8a6c744df79b096e254861ac447767" Dec 04 18:15:41 crc kubenswrapper[4948]: I1204 18:15:41.216989 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:15:41 crc kubenswrapper[4948]: E1204 18:15:41.217533 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:15:41 crc kubenswrapper[4948]: I1204 18:15:41.248821 4948 scope.go:117] "RemoveContainer" containerID="71402d013c6c0349294925fbd5f06e2b6982a5197a16e9fa95fe6c800ba75efa" Dec 04 18:15:52 crc kubenswrapper[4948]: I1204 18:15:52.913316 4948 scope.go:117] "RemoveContainer" 
containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:15:52 crc kubenswrapper[4948]: E1204 18:15:52.914155 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:16:03 crc kubenswrapper[4948]: I1204 18:16:03.914349 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:16:03 crc kubenswrapper[4948]: E1204 18:16:03.915103 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:16:17 crc kubenswrapper[4948]: I1204 18:16:17.914253 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:16:17 crc kubenswrapper[4948]: E1204 18:16:17.914968 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:16:31 crc kubenswrapper[4948]: I1204 18:16:31.914389 4948 scope.go:117] 
"RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:16:31 crc kubenswrapper[4948]: E1204 18:16:31.915446 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:16:43 crc kubenswrapper[4948]: I1204 18:16:43.913566 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:16:43 crc kubenswrapper[4948]: E1204 18:16:43.914321 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:16:55 crc kubenswrapper[4948]: I1204 18:16:55.915188 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:16:55 crc kubenswrapper[4948]: E1204 18:16:55.916007 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:17:06 crc kubenswrapper[4948]: I1204 18:17:06.914596 
4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:17:06 crc kubenswrapper[4948]: E1204 18:17:06.918399 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:17:20 crc kubenswrapper[4948]: I1204 18:17:20.914624 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:17:20 crc kubenswrapper[4948]: E1204 18:17:20.915177 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:17:35 crc kubenswrapper[4948]: I1204 18:17:35.914415 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:17:35 crc kubenswrapper[4948]: E1204 18:17:35.915185 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:17:50 crc kubenswrapper[4948]: I1204 
18:17:50.918659 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:17:50 crc kubenswrapper[4948]: E1204 18:17:50.919575 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:18:05 crc kubenswrapper[4948]: I1204 18:18:05.914863 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:18:05 crc kubenswrapper[4948]: E1204 18:18:05.916347 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:18:19 crc kubenswrapper[4948]: I1204 18:18:19.914021 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:18:19 crc kubenswrapper[4948]: E1204 18:18:19.915406 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:18:32 crc 
kubenswrapper[4948]: I1204 18:18:32.914383 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:18:32 crc kubenswrapper[4948]: E1204 18:18:32.915180 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:18:47 crc kubenswrapper[4948]: I1204 18:18:47.913484 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:18:47 crc kubenswrapper[4948]: E1204 18:18:47.914293 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:19:02 crc kubenswrapper[4948]: I1204 18:19:02.914583 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:19:02 crc kubenswrapper[4948]: E1204 18:19:02.915548 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 
04 18:19:17 crc kubenswrapper[4948]: I1204 18:19:17.914156 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:19:17 crc kubenswrapper[4948]: E1204 18:19:17.914917 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:19:29 crc kubenswrapper[4948]: I1204 18:19:29.914351 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:19:29 crc kubenswrapper[4948]: E1204 18:19:29.916877 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:19:41 crc kubenswrapper[4948]: I1204 18:19:41.397947 4948 scope.go:117] "RemoveContainer" containerID="ea994cd91f9baa90e3ecd42d4f0ebf605770803202db56a978ef1ee757bcb887" Dec 04 18:19:41 crc kubenswrapper[4948]: I1204 18:19:41.433658 4948 scope.go:117] "RemoveContainer" containerID="c69e95693f503f152c10fe0c0e9e608cfa189a93eb19dd7a0b5f2626cf89cd72" Dec 04 18:19:41 crc kubenswrapper[4948]: I1204 18:19:41.473291 4948 scope.go:117] "RemoveContainer" containerID="62d0dc724a2ba9607c7ce9a29d8e6db18c09cde9fb954a40a648a39dbee3346f" Dec 04 18:19:41 crc kubenswrapper[4948]: I1204 18:19:41.913761 4948 scope.go:117] "RemoveContainer" 
containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:19:41 crc kubenswrapper[4948]: E1204 18:19:41.914133 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:19:54 crc kubenswrapper[4948]: I1204 18:19:54.914787 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:19:54 crc kubenswrapper[4948]: E1204 18:19:54.916003 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:20:07 crc kubenswrapper[4948]: I1204 18:20:07.913817 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:20:07 crc kubenswrapper[4948]: E1204 18:20:07.914813 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:20:22 crc kubenswrapper[4948]: I1204 18:20:22.916208 4948 scope.go:117] 
"RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:20:22 crc kubenswrapper[4948]: E1204 18:20:22.917377 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.558597 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-spvjx"] Dec 04 18:20:36 crc kubenswrapper[4948]: E1204 18:20:36.559776 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0039de0-35fb-4079-81a8-d502073646f7" containerName="collect-profiles" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.560452 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0039de0-35fb-4079-81a8-d502073646f7" containerName="collect-profiles" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.560711 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0039de0-35fb-4079-81a8-d502073646f7" containerName="collect-profiles" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.562267 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.568791 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-spvjx"] Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.637232 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-utilities\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.637292 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64lsv\" (UniqueName: \"kubernetes.io/projected/3463b0ed-a23b-40e0-98c8-aca63b7acd14-kube-api-access-64lsv\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.637330 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-catalog-content\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.738906 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-utilities\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.739018 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-64lsv\" (UniqueName: \"kubernetes.io/projected/3463b0ed-a23b-40e0-98c8-aca63b7acd14-kube-api-access-64lsv\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.739115 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-catalog-content\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.739489 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-utilities\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.739506 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-catalog-content\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.760050 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lsv\" (UniqueName: \"kubernetes.io/projected/3463b0ed-a23b-40e0-98c8-aca63b7acd14-kube-api-access-64lsv\") pod \"certified-operators-spvjx\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.878839 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:36 crc kubenswrapper[4948]: I1204 18:20:36.913480 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:20:36 crc kubenswrapper[4948]: E1204 18:20:36.913812 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:20:37 crc kubenswrapper[4948]: I1204 18:20:37.147175 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-spvjx"] Dec 04 18:20:37 crc kubenswrapper[4948]: I1204 18:20:37.793324 4948 generic.go:334] "Generic (PLEG): container finished" podID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerID="31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3" exitCode=0 Dec 04 18:20:37 crc kubenswrapper[4948]: I1204 18:20:37.793410 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvjx" event={"ID":"3463b0ed-a23b-40e0-98c8-aca63b7acd14","Type":"ContainerDied","Data":"31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3"} Dec 04 18:20:37 crc kubenswrapper[4948]: I1204 18:20:37.793738 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvjx" event={"ID":"3463b0ed-a23b-40e0-98c8-aca63b7acd14","Type":"ContainerStarted","Data":"6a0a05a65a72ddef7ea46b50b6002037aaefaba925ee67ae52e8de65d37eb0bb"} Dec 04 18:20:37 crc kubenswrapper[4948]: I1204 18:20:37.796946 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 
18:20:38 crc kubenswrapper[4948]: I1204 18:20:38.802382 4948 generic.go:334] "Generic (PLEG): container finished" podID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerID="08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27" exitCode=0 Dec 04 18:20:38 crc kubenswrapper[4948]: I1204 18:20:38.802458 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvjx" event={"ID":"3463b0ed-a23b-40e0-98c8-aca63b7acd14","Type":"ContainerDied","Data":"08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27"} Dec 04 18:20:39 crc kubenswrapper[4948]: I1204 18:20:39.814398 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvjx" event={"ID":"3463b0ed-a23b-40e0-98c8-aca63b7acd14","Type":"ContainerStarted","Data":"6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f"} Dec 04 18:20:39 crc kubenswrapper[4948]: I1204 18:20:39.839858 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-spvjx" podStartSLOduration=2.41622042 podStartE2EDuration="3.8398367s" podCreationTimestamp="2025-12-04 18:20:36 +0000 UTC" firstStartedPulling="2025-12-04 18:20:37.795981966 +0000 UTC m=+3249.157056408" lastFinishedPulling="2025-12-04 18:20:39.219598286 +0000 UTC m=+3250.580672688" observedRunningTime="2025-12-04 18:20:39.833547625 +0000 UTC m=+3251.194622027" watchObservedRunningTime="2025-12-04 18:20:39.8398367 +0000 UTC m=+3251.200911092" Dec 04 18:20:46 crc kubenswrapper[4948]: I1204 18:20:46.879761 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:46 crc kubenswrapper[4948]: I1204 18:20:46.881294 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:46 crc kubenswrapper[4948]: I1204 18:20:46.949775 4948 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:47 crc kubenswrapper[4948]: I1204 18:20:47.916288 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:47 crc kubenswrapper[4948]: I1204 18:20:47.965716 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-spvjx"] Dec 04 18:20:49 crc kubenswrapper[4948]: I1204 18:20:49.892779 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-spvjx" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="registry-server" containerID="cri-o://6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f" gracePeriod=2 Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.818792 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.900516 4948 generic.go:334] "Generic (PLEG): container finished" podID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerID="6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f" exitCode=0 Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.900556 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvjx" event={"ID":"3463b0ed-a23b-40e0-98c8-aca63b7acd14","Type":"ContainerDied","Data":"6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f"} Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.900561 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-spvjx" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.900588 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spvjx" event={"ID":"3463b0ed-a23b-40e0-98c8-aca63b7acd14","Type":"ContainerDied","Data":"6a0a05a65a72ddef7ea46b50b6002037aaefaba925ee67ae52e8de65d37eb0bb"} Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.900605 4948 scope.go:117] "RemoveContainer" containerID="6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.913438 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.918401 4948 scope.go:117] "RemoveContainer" containerID="08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.944307 4948 scope.go:117] "RemoveContainer" containerID="31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.950926 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64lsv\" (UniqueName: \"kubernetes.io/projected/3463b0ed-a23b-40e0-98c8-aca63b7acd14-kube-api-access-64lsv\") pod \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.951408 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-catalog-content\") pod \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.951564 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-utilities\") pod \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\" (UID: \"3463b0ed-a23b-40e0-98c8-aca63b7acd14\") " Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.953018 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-utilities" (OuterVolumeSpecName: "utilities") pod "3463b0ed-a23b-40e0-98c8-aca63b7acd14" (UID: "3463b0ed-a23b-40e0-98c8-aca63b7acd14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.956750 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3463b0ed-a23b-40e0-98c8-aca63b7acd14-kube-api-access-64lsv" (OuterVolumeSpecName: "kube-api-access-64lsv") pod "3463b0ed-a23b-40e0-98c8-aca63b7acd14" (UID: "3463b0ed-a23b-40e0-98c8-aca63b7acd14"). InnerVolumeSpecName "kube-api-access-64lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.994606 4948 scope.go:117] "RemoveContainer" containerID="6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f" Dec 04 18:20:50 crc kubenswrapper[4948]: E1204 18:20:50.995087 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f\": container with ID starting with 6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f not found: ID does not exist" containerID="6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.995156 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f"} err="failed to get container status \"6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f\": rpc error: code = NotFound desc = could not find container \"6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f\": container with ID starting with 6a0593f797422c45971b49c5c0343f51ee316746212e3a0bd76c6e18b2d7765f not found: ID does not exist" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.995382 4948 scope.go:117] "RemoveContainer" containerID="08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27" Dec 04 18:20:50 crc kubenswrapper[4948]: E1204 18:20:50.996011 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27\": container with ID starting with 08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27 not found: ID does not exist" containerID="08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.996065 
4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27"} err="failed to get container status \"08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27\": rpc error: code = NotFound desc = could not find container \"08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27\": container with ID starting with 08140f06c7cc978b96542f038c721957e6cd5cf953c97726e89f88a042bf1e27 not found: ID does not exist" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.996095 4948 scope.go:117] "RemoveContainer" containerID="31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3" Dec 04 18:20:50 crc kubenswrapper[4948]: E1204 18:20:50.996415 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3\": container with ID starting with 31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3 not found: ID does not exist" containerID="31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3" Dec 04 18:20:50 crc kubenswrapper[4948]: I1204 18:20:50.996461 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3"} err="failed to get container status \"31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3\": rpc error: code = NotFound desc = could not find container \"31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3\": container with ID starting with 31d4b0e8d81cb926e898bf4c1d93da5dd6b6c6bb8c6cec6bf7b5c59cb7d023e3 not found: ID does not exist" Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.007835 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "3463b0ed-a23b-40e0-98c8-aca63b7acd14" (UID: "3463b0ed-a23b-40e0-98c8-aca63b7acd14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.052818 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.052844 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64lsv\" (UniqueName: \"kubernetes.io/projected/3463b0ed-a23b-40e0-98c8-aca63b7acd14-kube-api-access-64lsv\") on node \"crc\" DevicePath \"\"" Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.052855 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3463b0ed-a23b-40e0-98c8-aca63b7acd14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.230562 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-spvjx"] Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.236676 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-spvjx"] Dec 04 18:20:51 crc kubenswrapper[4948]: I1204 18:20:51.910635 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"4c59610ee7dadf15d6c04baf6ebec83099183e48e42ed1847cea1c3f2d15bca2"} Dec 04 18:20:52 crc kubenswrapper[4948]: I1204 18:20:52.925977 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" path="/var/lib/kubelet/pods/3463b0ed-a23b-40e0-98c8-aca63b7acd14/volumes" Dec 04 18:21:50 crc 
kubenswrapper[4948]: I1204 18:21:50.545553 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tz242"] Dec 04 18:21:50 crc kubenswrapper[4948]: E1204 18:21:50.548157 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="extract-utilities" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.548289 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="extract-utilities" Dec 04 18:21:50 crc kubenswrapper[4948]: E1204 18:21:50.548380 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="extract-content" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.548466 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="extract-content" Dec 04 18:21:50 crc kubenswrapper[4948]: E1204 18:21:50.548554 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="registry-server" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.548637 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="registry-server" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.548905 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="3463b0ed-a23b-40e0-98c8-aca63b7acd14" containerName="registry-server" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.550293 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.554016 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz242"] Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.661691 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-utilities\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.661746 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-catalog-content\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.661790 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfk5x\" (UniqueName: \"kubernetes.io/projected/996ed6ef-071e-4f17-8a47-71889c18657c-kube-api-access-hfk5x\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.763331 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfk5x\" (UniqueName: \"kubernetes.io/projected/996ed6ef-071e-4f17-8a47-71889c18657c-kube-api-access-hfk5x\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.763451 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-utilities\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.763493 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-catalog-content\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.764332 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-catalog-content\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.765598 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-utilities\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.786883 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfk5x\" (UniqueName: \"kubernetes.io/projected/996ed6ef-071e-4f17-8a47-71889c18657c-kube-api-access-hfk5x\") pod \"redhat-marketplace-tz242\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:50 crc kubenswrapper[4948]: I1204 18:21:50.866044 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:21:51 crc kubenswrapper[4948]: I1204 18:21:51.301318 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz242"] Dec 04 18:21:51 crc kubenswrapper[4948]: I1204 18:21:51.406869 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz242" event={"ID":"996ed6ef-071e-4f17-8a47-71889c18657c","Type":"ContainerStarted","Data":"3c80925422f59fe05f5899753e2620b7e9a9c98220bc6e55db2926d221020e6d"} Dec 04 18:21:52 crc kubenswrapper[4948]: I1204 18:21:52.417856 4948 generic.go:334] "Generic (PLEG): container finished" podID="996ed6ef-071e-4f17-8a47-71889c18657c" containerID="2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92" exitCode=0 Dec 04 18:21:52 crc kubenswrapper[4948]: I1204 18:21:52.417999 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz242" event={"ID":"996ed6ef-071e-4f17-8a47-71889c18657c","Type":"ContainerDied","Data":"2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92"} Dec 04 18:21:53 crc kubenswrapper[4948]: I1204 18:21:53.432106 4948 generic.go:334] "Generic (PLEG): container finished" podID="996ed6ef-071e-4f17-8a47-71889c18657c" containerID="5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217" exitCode=0 Dec 04 18:21:53 crc kubenswrapper[4948]: I1204 18:21:53.432166 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz242" event={"ID":"996ed6ef-071e-4f17-8a47-71889c18657c","Type":"ContainerDied","Data":"5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217"} Dec 04 18:21:54 crc kubenswrapper[4948]: I1204 18:21:54.446422 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz242" 
event={"ID":"996ed6ef-071e-4f17-8a47-71889c18657c","Type":"ContainerStarted","Data":"b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815"} Dec 04 18:21:54 crc kubenswrapper[4948]: I1204 18:21:54.471923 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tz242" podStartSLOduration=2.782493481 podStartE2EDuration="4.471900632s" podCreationTimestamp="2025-12-04 18:21:50 +0000 UTC" firstStartedPulling="2025-12-04 18:21:52.42021817 +0000 UTC m=+3323.781292582" lastFinishedPulling="2025-12-04 18:21:54.109625331 +0000 UTC m=+3325.470699733" observedRunningTime="2025-12-04 18:21:54.469378511 +0000 UTC m=+3325.830452913" watchObservedRunningTime="2025-12-04 18:21:54.471900632 +0000 UTC m=+3325.832975044" Dec 04 18:22:00 crc kubenswrapper[4948]: I1204 18:22:00.866448 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:22:00 crc kubenswrapper[4948]: I1204 18:22:00.867009 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:22:00 crc kubenswrapper[4948]: I1204 18:22:00.926173 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:22:01 crc kubenswrapper[4948]: I1204 18:22:01.563355 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:22:01 crc kubenswrapper[4948]: I1204 18:22:01.620700 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz242"] Dec 04 18:22:03 crc kubenswrapper[4948]: I1204 18:22:03.515714 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tz242" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="registry-server" 
containerID="cri-o://b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815" gracePeriod=2 Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.471747 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.524361 4948 generic.go:334] "Generic (PLEG): container finished" podID="996ed6ef-071e-4f17-8a47-71889c18657c" containerID="b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815" exitCode=0 Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.524402 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz242" event={"ID":"996ed6ef-071e-4f17-8a47-71889c18657c","Type":"ContainerDied","Data":"b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815"} Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.524447 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tz242" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.524473 4948 scope.go:117] "RemoveContainer" containerID="b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.524461 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tz242" event={"ID":"996ed6ef-071e-4f17-8a47-71889c18657c","Type":"ContainerDied","Data":"3c80925422f59fe05f5899753e2620b7e9a9c98220bc6e55db2926d221020e6d"} Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.545252 4948 scope.go:117] "RemoveContainer" containerID="5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.563027 4948 scope.go:117] "RemoveContainer" containerID="2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.563887 4948 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfk5x\" (UniqueName: \"kubernetes.io/projected/996ed6ef-071e-4f17-8a47-71889c18657c-kube-api-access-hfk5x\") pod \"996ed6ef-071e-4f17-8a47-71889c18657c\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.563959 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-catalog-content\") pod \"996ed6ef-071e-4f17-8a47-71889c18657c\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.564082 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-utilities\") pod \"996ed6ef-071e-4f17-8a47-71889c18657c\" (UID: \"996ed6ef-071e-4f17-8a47-71889c18657c\") " Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.565399 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-utilities" (OuterVolumeSpecName: "utilities") pod "996ed6ef-071e-4f17-8a47-71889c18657c" (UID: "996ed6ef-071e-4f17-8a47-71889c18657c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.565722 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.571503 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996ed6ef-071e-4f17-8a47-71889c18657c-kube-api-access-hfk5x" (OuterVolumeSpecName: "kube-api-access-hfk5x") pod "996ed6ef-071e-4f17-8a47-71889c18657c" (UID: "996ed6ef-071e-4f17-8a47-71889c18657c"). InnerVolumeSpecName "kube-api-access-hfk5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.584564 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "996ed6ef-071e-4f17-8a47-71889c18657c" (UID: "996ed6ef-071e-4f17-8a47-71889c18657c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.613790 4948 scope.go:117] "RemoveContainer" containerID="b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815" Dec 04 18:22:04 crc kubenswrapper[4948]: E1204 18:22:04.614291 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815\": container with ID starting with b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815 not found: ID does not exist" containerID="b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.614333 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815"} err="failed to get container status \"b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815\": rpc error: code = NotFound desc = could not find container \"b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815\": container with ID starting with b341af5607505b5097bb7d2014134e315cba7a8b918c3d2335ae09cc1128b815 not found: ID does not exist" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.614361 4948 scope.go:117] "RemoveContainer" containerID="5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217" Dec 04 18:22:04 crc kubenswrapper[4948]: E1204 18:22:04.614672 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217\": container with ID starting with 5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217 not found: ID does not exist" containerID="5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.614699 
4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217"} err="failed to get container status \"5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217\": rpc error: code = NotFound desc = could not find container \"5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217\": container with ID starting with 5e315d04205f4cf1a54e5a621e6c0eed63530f11fe35628476e7e31fe9682217 not found: ID does not exist" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.614720 4948 scope.go:117] "RemoveContainer" containerID="2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92" Dec 04 18:22:04 crc kubenswrapper[4948]: E1204 18:22:04.614981 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92\": container with ID starting with 2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92 not found: ID does not exist" containerID="2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.615034 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92"} err="failed to get container status \"2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92\": rpc error: code = NotFound desc = could not find container \"2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92\": container with ID starting with 2750383c4835f1e5bf94c28b2560e79f0f0faf1cfda175e2da7f4fbca1cd6f92 not found: ID does not exist" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.667256 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfk5x\" (UniqueName: 
\"kubernetes.io/projected/996ed6ef-071e-4f17-8a47-71889c18657c-kube-api-access-hfk5x\") on node \"crc\" DevicePath \"\"" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.667332 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/996ed6ef-071e-4f17-8a47-71889c18657c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.879984 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz242"] Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.890197 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tz242"] Dec 04 18:22:04 crc kubenswrapper[4948]: I1204 18:22:04.927180 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" path="/var/lib/kubelet/pods/996ed6ef-071e-4f17-8a47-71889c18657c/volumes" Dec 04 18:23:10 crc kubenswrapper[4948]: I1204 18:23:10.625184 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:23:10 crc kubenswrapper[4948]: I1204 18:23:10.625783 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.624984 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.625594 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.956546 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fz7dx"] Dec 04 18:23:40 crc kubenswrapper[4948]: E1204 18:23:40.956888 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="extract-content" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.956906 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="extract-content" Dec 04 18:23:40 crc kubenswrapper[4948]: E1204 18:23:40.956920 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="registry-server" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.956927 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="registry-server" Dec 04 18:23:40 crc kubenswrapper[4948]: E1204 18:23:40.956954 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="extract-utilities" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.956961 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="extract-utilities" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.957151 4948 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="996ed6ef-071e-4f17-8a47-71889c18657c" containerName="registry-server" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.958451 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:40 crc kubenswrapper[4948]: I1204 18:23:40.964819 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fz7dx"] Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.076813 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/399729d9-b154-4bfd-9a86-19a9a763f038-kube-api-access-7tdd8\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.077162 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-utilities\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.077287 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-catalog-content\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.178824 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/399729d9-b154-4bfd-9a86-19a9a763f038-kube-api-access-7tdd8\") pod \"community-operators-fz7dx\" 
(UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.179185 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-utilities\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.179313 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-catalog-content\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.179670 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-utilities\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.179758 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-catalog-content\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.199952 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/399729d9-b154-4bfd-9a86-19a9a763f038-kube-api-access-7tdd8\") pod \"community-operators-fz7dx\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " 
pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.277483 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.761729 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fz7dx"] Dec 04 18:23:41 crc kubenswrapper[4948]: I1204 18:23:41.878442 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerStarted","Data":"03aaf4817dcb7c5faaac8bc66ab2a7c79f30335f5871c5cfefd9fbfc9cf7e946"} Dec 04 18:23:42 crc kubenswrapper[4948]: I1204 18:23:42.890438 4948 generic.go:334] "Generic (PLEG): container finished" podID="399729d9-b154-4bfd-9a86-19a9a763f038" containerID="b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587" exitCode=0 Dec 04 18:23:42 crc kubenswrapper[4948]: I1204 18:23:42.890471 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerDied","Data":"b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587"} Dec 04 18:23:43 crc kubenswrapper[4948]: I1204 18:23:43.902757 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerStarted","Data":"d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651"} Dec 04 18:23:44 crc kubenswrapper[4948]: I1204 18:23:44.914482 4948 generic.go:334] "Generic (PLEG): container finished" podID="399729d9-b154-4bfd-9a86-19a9a763f038" containerID="d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651" exitCode=0 Dec 04 18:23:44 crc kubenswrapper[4948]: I1204 18:23:44.929702 4948 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerDied","Data":"d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651"} Dec 04 18:23:45 crc kubenswrapper[4948]: I1204 18:23:45.923519 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerStarted","Data":"e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5"} Dec 04 18:23:45 crc kubenswrapper[4948]: I1204 18:23:45.946376 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fz7dx" podStartSLOduration=3.319361222 podStartE2EDuration="5.946350485s" podCreationTimestamp="2025-12-04 18:23:40 +0000 UTC" firstStartedPulling="2025-12-04 18:23:42.893101177 +0000 UTC m=+3434.254175589" lastFinishedPulling="2025-12-04 18:23:45.52009044 +0000 UTC m=+3436.881164852" observedRunningTime="2025-12-04 18:23:45.944519424 +0000 UTC m=+3437.305593856" watchObservedRunningTime="2025-12-04 18:23:45.946350485 +0000 UTC m=+3437.307424897" Dec 04 18:23:51 crc kubenswrapper[4948]: I1204 18:23:51.278223 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:51 crc kubenswrapper[4948]: I1204 18:23:51.278740 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:51 crc kubenswrapper[4948]: I1204 18:23:51.322358 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:52 crc kubenswrapper[4948]: I1204 18:23:52.048681 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:52 crc kubenswrapper[4948]: I1204 
18:23:52.106556 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fz7dx"] Dec 04 18:23:53 crc kubenswrapper[4948]: I1204 18:23:53.989194 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fz7dx" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="registry-server" containerID="cri-o://e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5" gracePeriod=2 Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.432022 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.588414 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-catalog-content\") pod \"399729d9-b154-4bfd-9a86-19a9a763f038\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.588626 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/399729d9-b154-4bfd-9a86-19a9a763f038-kube-api-access-7tdd8\") pod \"399729d9-b154-4bfd-9a86-19a9a763f038\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.588742 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-utilities\") pod \"399729d9-b154-4bfd-9a86-19a9a763f038\" (UID: \"399729d9-b154-4bfd-9a86-19a9a763f038\") " Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.592502 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-utilities" (OuterVolumeSpecName: 
"utilities") pod "399729d9-b154-4bfd-9a86-19a9a763f038" (UID: "399729d9-b154-4bfd-9a86-19a9a763f038"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.599423 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399729d9-b154-4bfd-9a86-19a9a763f038-kube-api-access-7tdd8" (OuterVolumeSpecName: "kube-api-access-7tdd8") pod "399729d9-b154-4bfd-9a86-19a9a763f038" (UID: "399729d9-b154-4bfd-9a86-19a9a763f038"). InnerVolumeSpecName "kube-api-access-7tdd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.647204 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "399729d9-b154-4bfd-9a86-19a9a763f038" (UID: "399729d9-b154-4bfd-9a86-19a9a763f038"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.690596 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/399729d9-b154-4bfd-9a86-19a9a763f038-kube-api-access-7tdd8\") on node \"crc\" DevicePath \"\"" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.690641 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.690654 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/399729d9-b154-4bfd-9a86-19a9a763f038-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.999033 4948 generic.go:334] "Generic (PLEG): container finished" podID="399729d9-b154-4bfd-9a86-19a9a763f038" containerID="e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5" exitCode=0 Dec 04 18:23:54 crc kubenswrapper[4948]: I1204 18:23:54.999094 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerDied","Data":"e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5"} Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:54.999124 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fz7dx" event={"ID":"399729d9-b154-4bfd-9a86-19a9a763f038","Type":"ContainerDied","Data":"03aaf4817dcb7c5faaac8bc66ab2a7c79f30335f5871c5cfefd9fbfc9cf7e946"} Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:54.999144 4948 scope.go:117] "RemoveContainer" containerID="e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 
18:23:54.999162 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fz7dx" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.025054 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fz7dx"] Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.032021 4948 scope.go:117] "RemoveContainer" containerID="d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.033213 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fz7dx"] Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.051455 4948 scope.go:117] "RemoveContainer" containerID="b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.072840 4948 scope.go:117] "RemoveContainer" containerID="e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5" Dec 04 18:23:55 crc kubenswrapper[4948]: E1204 18:23:55.073305 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5\": container with ID starting with e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5 not found: ID does not exist" containerID="e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.073347 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5"} err="failed to get container status \"e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5\": rpc error: code = NotFound desc = could not find container \"e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5\": container with ID starting with 
e6e4fbc12c3b568a1c54992b6777ee99ca655c9fe327f7ab1e7418d0345800c5 not found: ID does not exist" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.073374 4948 scope.go:117] "RemoveContainer" containerID="d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651" Dec 04 18:23:55 crc kubenswrapper[4948]: E1204 18:23:55.073680 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651\": container with ID starting with d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651 not found: ID does not exist" containerID="d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.073723 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651"} err="failed to get container status \"d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651\": rpc error: code = NotFound desc = could not find container \"d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651\": container with ID starting with d81824cfded2806760a81e555fdac0074626b748cd17fe4f3fdd332a83962651 not found: ID does not exist" Dec 04 18:23:55 crc kubenswrapper[4948]: I1204 18:23:55.073750 4948 scope.go:117] "RemoveContainer" containerID="b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587" Dec 04 18:23:55 crc kubenswrapper[4948]: E1204 18:23:55.074397 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587\": container with ID starting with b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587 not found: ID does not exist" containerID="b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587" Dec 04 18:23:55 crc 
kubenswrapper[4948]: I1204 18:23:55.074425 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587"} err="failed to get container status \"b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587\": rpc error: code = NotFound desc = could not find container \"b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587\": container with ID starting with b8f9f2d2e57570db6024d9fe9294a0d121643540aeb47ae2a591dd436c02d587 not found: ID does not exist" Dec 04 18:23:56 crc kubenswrapper[4948]: I1204 18:23:56.921400 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" path="/var/lib/kubelet/pods/399729d9-b154-4bfd-9a86-19a9a763f038/volumes" Dec 04 18:24:10 crc kubenswrapper[4948]: I1204 18:24:10.625197 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:24:10 crc kubenswrapper[4948]: I1204 18:24:10.625819 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:24:10 crc kubenswrapper[4948]: I1204 18:24:10.625886 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:24:10 crc kubenswrapper[4948]: I1204 18:24:10.626628 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4c59610ee7dadf15d6c04baf6ebec83099183e48e42ed1847cea1c3f2d15bca2"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:24:10 crc kubenswrapper[4948]: I1204 18:24:10.626719 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://4c59610ee7dadf15d6c04baf6ebec83099183e48e42ed1847cea1c3f2d15bca2" gracePeriod=600 Dec 04 18:24:11 crc kubenswrapper[4948]: I1204 18:24:11.150801 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="4c59610ee7dadf15d6c04baf6ebec83099183e48e42ed1847cea1c3f2d15bca2" exitCode=0 Dec 04 18:24:11 crc kubenswrapper[4948]: I1204 18:24:11.150870 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"4c59610ee7dadf15d6c04baf6ebec83099183e48e42ed1847cea1c3f2d15bca2"} Dec 04 18:24:11 crc kubenswrapper[4948]: I1204 18:24:11.151295 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115"} Dec 04 18:24:11 crc kubenswrapper[4948]: I1204 18:24:11.151321 4948 scope.go:117] "RemoveContainer" containerID="1a80b91f008df18956fba21562874d0aff4a56417a38eb3bb220889f403ef00b" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.644147 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ssc6"] Dec 04 18:24:37 crc kubenswrapper[4948]: E1204 18:24:37.645490 
4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="extract-content" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.645506 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="extract-content" Dec 04 18:24:37 crc kubenswrapper[4948]: E1204 18:24:37.645522 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="registry-server" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.645528 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="registry-server" Dec 04 18:24:37 crc kubenswrapper[4948]: E1204 18:24:37.645543 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="extract-utilities" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.645552 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="extract-utilities" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.645741 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="399729d9-b154-4bfd-9a86-19a9a763f038" containerName="registry-server" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.647084 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.662835 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ssc6"] Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.815909 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2b7q\" (UniqueName: \"kubernetes.io/projected/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-kube-api-access-c2b7q\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.816013 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-utilities\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.816061 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-catalog-content\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.917514 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2b7q\" (UniqueName: \"kubernetes.io/projected/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-kube-api-access-c2b7q\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.917597 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-utilities\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.917617 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-catalog-content\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.918142 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-catalog-content\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.918171 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-utilities\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.937247 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2b7q\" (UniqueName: \"kubernetes.io/projected/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-kube-api-access-c2b7q\") pod \"redhat-operators-5ssc6\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:37 crc kubenswrapper[4948]: I1204 18:24:37.972222 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:38 crc kubenswrapper[4948]: I1204 18:24:38.431723 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ssc6"] Dec 04 18:24:39 crc kubenswrapper[4948]: I1204 18:24:39.376279 4948 generic.go:334] "Generic (PLEG): container finished" podID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerID="3ccc8efebdcd9372335c8cd9bf53b19ea9938ecab49eaa816d5ae70d19d633bc" exitCode=0 Dec 04 18:24:39 crc kubenswrapper[4948]: I1204 18:24:39.376343 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ssc6" event={"ID":"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3","Type":"ContainerDied","Data":"3ccc8efebdcd9372335c8cd9bf53b19ea9938ecab49eaa816d5ae70d19d633bc"} Dec 04 18:24:39 crc kubenswrapper[4948]: I1204 18:24:39.377353 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ssc6" event={"ID":"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3","Type":"ContainerStarted","Data":"420fc74a84138efb6afb88b3023eead07ffd70ac0a16f4765b1f973495907e64"} Dec 04 18:24:41 crc kubenswrapper[4948]: I1204 18:24:41.395727 4948 generic.go:334] "Generic (PLEG): container finished" podID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerID="0412f3e8aac4f632da15144a3d707e9dad9ec06034791b17fbaba91a60c498c6" exitCode=0 Dec 04 18:24:41 crc kubenswrapper[4948]: I1204 18:24:41.395845 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ssc6" event={"ID":"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3","Type":"ContainerDied","Data":"0412f3e8aac4f632da15144a3d707e9dad9ec06034791b17fbaba91a60c498c6"} Dec 04 18:24:42 crc kubenswrapper[4948]: I1204 18:24:42.413442 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ssc6" 
event={"ID":"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3","Type":"ContainerStarted","Data":"1925f29f954c5b2328a97ea7d25411cff833f7e140e335625b43ff79ee798289"} Dec 04 18:24:42 crc kubenswrapper[4948]: I1204 18:24:42.440068 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ssc6" podStartSLOduration=2.970573411 podStartE2EDuration="5.440032756s" podCreationTimestamp="2025-12-04 18:24:37 +0000 UTC" firstStartedPulling="2025-12-04 18:24:39.378381049 +0000 UTC m=+3490.739455491" lastFinishedPulling="2025-12-04 18:24:41.847840394 +0000 UTC m=+3493.208914836" observedRunningTime="2025-12-04 18:24:42.433565865 +0000 UTC m=+3493.794640277" watchObservedRunningTime="2025-12-04 18:24:42.440032756 +0000 UTC m=+3493.801107148" Dec 04 18:24:47 crc kubenswrapper[4948]: I1204 18:24:47.972979 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:47 crc kubenswrapper[4948]: I1204 18:24:47.973353 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:48 crc kubenswrapper[4948]: I1204 18:24:48.034282 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:48 crc kubenswrapper[4948]: I1204 18:24:48.515813 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:48 crc kubenswrapper[4948]: I1204 18:24:48.573116 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ssc6"] Dec 04 18:24:50 crc kubenswrapper[4948]: I1204 18:24:50.469675 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5ssc6" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="registry-server" 
containerID="cri-o://1925f29f954c5b2328a97ea7d25411cff833f7e140e335625b43ff79ee798289" gracePeriod=2 Dec 04 18:24:51 crc kubenswrapper[4948]: I1204 18:24:51.477612 4948 generic.go:334] "Generic (PLEG): container finished" podID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerID="1925f29f954c5b2328a97ea7d25411cff833f7e140e335625b43ff79ee798289" exitCode=0 Dec 04 18:24:51 crc kubenswrapper[4948]: I1204 18:24:51.477669 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ssc6" event={"ID":"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3","Type":"ContainerDied","Data":"1925f29f954c5b2328a97ea7d25411cff833f7e140e335625b43ff79ee798289"} Dec 04 18:24:51 crc kubenswrapper[4948]: I1204 18:24:51.979179 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.142558 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2b7q\" (UniqueName: \"kubernetes.io/projected/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-kube-api-access-c2b7q\") pod \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.142691 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-utilities\") pod \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.142756 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-catalog-content\") pod \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\" (UID: \"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3\") " Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 
18:24:52.143990 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-utilities" (OuterVolumeSpecName: "utilities") pod "975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" (UID: "975eb1d5-f6fb-40eb-8b65-6aa28880c1a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.151557 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-kube-api-access-c2b7q" (OuterVolumeSpecName: "kube-api-access-c2b7q") pod "975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" (UID: "975eb1d5-f6fb-40eb-8b65-6aa28880c1a3"). InnerVolumeSpecName "kube-api-access-c2b7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.246561 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2b7q\" (UniqueName: \"kubernetes.io/projected/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-kube-api-access-c2b7q\") on node \"crc\" DevicePath \"\"" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.246621 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.271450 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" (UID: "975eb1d5-f6fb-40eb-8b65-6aa28880c1a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.348572 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.489693 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ssc6" event={"ID":"975eb1d5-f6fb-40eb-8b65-6aa28880c1a3","Type":"ContainerDied","Data":"420fc74a84138efb6afb88b3023eead07ffd70ac0a16f4765b1f973495907e64"} Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.489779 4948 scope.go:117] "RemoveContainer" containerID="1925f29f954c5b2328a97ea7d25411cff833f7e140e335625b43ff79ee798289" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.489798 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ssc6" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.506788 4948 scope.go:117] "RemoveContainer" containerID="0412f3e8aac4f632da15144a3d707e9dad9ec06034791b17fbaba91a60c498c6" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.529164 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ssc6"] Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.533563 4948 scope.go:117] "RemoveContainer" containerID="3ccc8efebdcd9372335c8cd9bf53b19ea9938ecab49eaa816d5ae70d19d633bc" Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.536675 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5ssc6"] Dec 04 18:24:52 crc kubenswrapper[4948]: I1204 18:24:52.922074 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" path="/var/lib/kubelet/pods/975eb1d5-f6fb-40eb-8b65-6aa28880c1a3/volumes" Dec 04 18:26:10 crc 
kubenswrapper[4948]: I1204 18:26:10.626220 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:26:10 crc kubenswrapper[4948]: I1204 18:26:10.628746 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:26:40 crc kubenswrapper[4948]: I1204 18:26:40.625608 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:26:40 crc kubenswrapper[4948]: I1204 18:26:40.626033 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:27:10 crc kubenswrapper[4948]: I1204 18:27:10.625265 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:27:10 crc kubenswrapper[4948]: I1204 18:27:10.626084 4948 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:27:10 crc kubenswrapper[4948]: I1204 18:27:10.626163 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:27:10 crc kubenswrapper[4948]: I1204 18:27:10.626957 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:27:10 crc kubenswrapper[4948]: I1204 18:27:10.627114 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" gracePeriod=600 Dec 04 18:27:10 crc kubenswrapper[4948]: E1204 18:27:10.750548 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:27:11 crc kubenswrapper[4948]: I1204 18:27:11.683627 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" 
containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" exitCode=0 Dec 04 18:27:11 crc kubenswrapper[4948]: I1204 18:27:11.683685 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115"} Dec 04 18:27:11 crc kubenswrapper[4948]: I1204 18:27:11.683745 4948 scope.go:117] "RemoveContainer" containerID="4c59610ee7dadf15d6c04baf6ebec83099183e48e42ed1847cea1c3f2d15bca2" Dec 04 18:27:11 crc kubenswrapper[4948]: I1204 18:27:11.684332 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:27:11 crc kubenswrapper[4948]: E1204 18:27:11.684576 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:27:25 crc kubenswrapper[4948]: I1204 18:27:25.913952 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:27:25 crc kubenswrapper[4948]: E1204 18:27:25.915002 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:27:36 crc kubenswrapper[4948]: I1204 
18:27:36.914377 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:27:36 crc kubenswrapper[4948]: E1204 18:27:36.915670 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:27:48 crc kubenswrapper[4948]: I1204 18:27:48.928234 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:27:48 crc kubenswrapper[4948]: E1204 18:27:48.929071 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:28:00 crc kubenswrapper[4948]: I1204 18:28:00.914829 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:28:00 crc kubenswrapper[4948]: E1204 18:28:00.915872 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:28:14 crc 
kubenswrapper[4948]: I1204 18:28:14.913495 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:28:14 crc kubenswrapper[4948]: E1204 18:28:14.914266 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:28:26 crc kubenswrapper[4948]: I1204 18:28:26.913807 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:28:26 crc kubenswrapper[4948]: E1204 18:28:26.914576 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:28:37 crc kubenswrapper[4948]: I1204 18:28:37.913520 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:28:37 crc kubenswrapper[4948]: E1204 18:28:37.914183 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 
04 18:28:49 crc kubenswrapper[4948]: I1204 18:28:49.913219 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:28:49 crc kubenswrapper[4948]: E1204 18:28:49.913949 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:29:03 crc kubenswrapper[4948]: I1204 18:29:03.913847 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:29:03 crc kubenswrapper[4948]: E1204 18:29:03.914512 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:29:17 crc kubenswrapper[4948]: I1204 18:29:17.914474 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:29:17 crc kubenswrapper[4948]: E1204 18:29:17.915501 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" 
podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:29:32 crc kubenswrapper[4948]: I1204 18:29:32.914409 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:29:32 crc kubenswrapper[4948]: E1204 18:29:32.915522 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:29:45 crc kubenswrapper[4948]: I1204 18:29:45.914207 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:29:45 crc kubenswrapper[4948]: E1204 18:29:45.914896 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:29:56 crc kubenswrapper[4948]: I1204 18:29:56.919482 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:29:56 crc kubenswrapper[4948]: E1204 18:29:56.922138 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.197773 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2"] Dec 04 18:30:00 crc kubenswrapper[4948]: E1204 18:30:00.198366 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="extract-utilities" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.198381 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="extract-utilities" Dec 04 18:30:00 crc kubenswrapper[4948]: E1204 18:30:00.198401 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="registry-server" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.198417 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="registry-server" Dec 04 18:30:00 crc kubenswrapper[4948]: E1204 18:30:00.198428 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="extract-content" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.198434 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="extract-content" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.198593 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="975eb1d5-f6fb-40eb-8b65-6aa28880c1a3" containerName="registry-server" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.199067 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.203912 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.204708 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.218495 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2"] Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.302068 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mnh\" (UniqueName: \"kubernetes.io/projected/95c637be-0882-49a5-817c-88a2784c6e43-kube-api-access-g7mnh\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.302331 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95c637be-0882-49a5-817c-88a2784c6e43-config-volume\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.302476 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95c637be-0882-49a5-817c-88a2784c6e43-secret-volume\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.403293 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mnh\" (UniqueName: \"kubernetes.io/projected/95c637be-0882-49a5-817c-88a2784c6e43-kube-api-access-g7mnh\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.403578 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95c637be-0882-49a5-817c-88a2784c6e43-config-volume\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.403668 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95c637be-0882-49a5-817c-88a2784c6e43-secret-volume\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.404473 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95c637be-0882-49a5-817c-88a2784c6e43-config-volume\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.409273 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/95c637be-0882-49a5-817c-88a2784c6e43-secret-volume\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.424243 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7mnh\" (UniqueName: \"kubernetes.io/projected/95c637be-0882-49a5-817c-88a2784c6e43-kube-api-access-g7mnh\") pod \"collect-profiles-29414550-zp6t2\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.529979 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:00 crc kubenswrapper[4948]: I1204 18:30:00.751360 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2"] Dec 04 18:30:01 crc kubenswrapper[4948]: I1204 18:30:01.128490 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" event={"ID":"95c637be-0882-49a5-817c-88a2784c6e43","Type":"ContainerStarted","Data":"a21c051a1b24077ac0b870a0b3fda39a87049874bca9786ea53a727be233a702"} Dec 04 18:30:01 crc kubenswrapper[4948]: I1204 18:30:01.128543 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" event={"ID":"95c637be-0882-49a5-817c-88a2784c6e43","Type":"ContainerStarted","Data":"911100219fa63bc026828cf0a3c71a762ecbb05d1a58c895a79cafb4fd577c9f"} Dec 04 18:30:02 crc kubenswrapper[4948]: I1204 18:30:02.139229 4948 generic.go:334] "Generic (PLEG): container finished" podID="95c637be-0882-49a5-817c-88a2784c6e43" 
containerID="a21c051a1b24077ac0b870a0b3fda39a87049874bca9786ea53a727be233a702" exitCode=0 Dec 04 18:30:02 crc kubenswrapper[4948]: I1204 18:30:02.139319 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" event={"ID":"95c637be-0882-49a5-817c-88a2784c6e43","Type":"ContainerDied","Data":"a21c051a1b24077ac0b870a0b3fda39a87049874bca9786ea53a727be233a702"} Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.436776 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.547950 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7mnh\" (UniqueName: \"kubernetes.io/projected/95c637be-0882-49a5-817c-88a2784c6e43-kube-api-access-g7mnh\") pod \"95c637be-0882-49a5-817c-88a2784c6e43\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.550395 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95c637be-0882-49a5-817c-88a2784c6e43-secret-volume\") pod \"95c637be-0882-49a5-817c-88a2784c6e43\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.550446 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95c637be-0882-49a5-817c-88a2784c6e43-config-volume\") pod \"95c637be-0882-49a5-817c-88a2784c6e43\" (UID: \"95c637be-0882-49a5-817c-88a2784c6e43\") " Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.551614 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c637be-0882-49a5-817c-88a2784c6e43-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"95c637be-0882-49a5-817c-88a2784c6e43" (UID: "95c637be-0882-49a5-817c-88a2784c6e43"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.563442 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c637be-0882-49a5-817c-88a2784c6e43-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "95c637be-0882-49a5-817c-88a2784c6e43" (UID: "95c637be-0882-49a5-817c-88a2784c6e43"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.568013 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c637be-0882-49a5-817c-88a2784c6e43-kube-api-access-g7mnh" (OuterVolumeSpecName: "kube-api-access-g7mnh") pod "95c637be-0882-49a5-817c-88a2784c6e43" (UID: "95c637be-0882-49a5-817c-88a2784c6e43"). InnerVolumeSpecName "kube-api-access-g7mnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.652566 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7mnh\" (UniqueName: \"kubernetes.io/projected/95c637be-0882-49a5-817c-88a2784c6e43-kube-api-access-g7mnh\") on node \"crc\" DevicePath \"\"" Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.652608 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/95c637be-0882-49a5-817c-88a2784c6e43-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:30:03 crc kubenswrapper[4948]: I1204 18:30:03.652621 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/95c637be-0882-49a5-817c-88a2784c6e43-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:30:04 crc kubenswrapper[4948]: I1204 18:30:04.159528 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" Dec 04 18:30:04 crc kubenswrapper[4948]: I1204 18:30:04.159420 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414550-zp6t2" event={"ID":"95c637be-0882-49a5-817c-88a2784c6e43","Type":"ContainerDied","Data":"911100219fa63bc026828cf0a3c71a762ecbb05d1a58c895a79cafb4fd577c9f"} Dec 04 18:30:04 crc kubenswrapper[4948]: I1204 18:30:04.160453 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="911100219fa63bc026828cf0a3c71a762ecbb05d1a58c895a79cafb4fd577c9f" Dec 04 18:30:04 crc kubenswrapper[4948]: I1204 18:30:04.509692 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv"] Dec 04 18:30:04 crc kubenswrapper[4948]: I1204 18:30:04.515377 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29414505-tckhv"] Dec 04 18:30:04 crc kubenswrapper[4948]: I1204 18:30:04.925214 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59db2ba4-208f-47bc-87f4-3c357f18db23" path="/var/lib/kubelet/pods/59db2ba4-208f-47bc-87f4-3c357f18db23/volumes" Dec 04 18:30:07 crc kubenswrapper[4948]: I1204 18:30:07.914076 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:30:07 crc kubenswrapper[4948]: E1204 18:30:07.914826 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:30:20 crc kubenswrapper[4948]: I1204 18:30:20.914228 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:30:20 crc kubenswrapper[4948]: E1204 18:30:20.915197 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:30:34 crc kubenswrapper[4948]: I1204 18:30:34.913932 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:30:34 crc kubenswrapper[4948]: E1204 18:30:34.914889 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:30:41 crc kubenswrapper[4948]: I1204 18:30:41.763670 4948 scope.go:117] "RemoveContainer" containerID="18646656f7c3653af7102e027e29cb28e124f4eb8e9ef366b86bdcbd5c4ea11a" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.108247 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8wmds"] Dec 04 18:30:45 crc kubenswrapper[4948]: E1204 18:30:45.109140 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c637be-0882-49a5-817c-88a2784c6e43" containerName="collect-profiles" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.109162 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c637be-0882-49a5-817c-88a2784c6e43" containerName="collect-profiles" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.109434 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c637be-0882-49a5-817c-88a2784c6e43" containerName="collect-profiles" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.111286 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.128255 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wmds"] Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.239626 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-utilities\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.240032 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-catalog-content\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.240136 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x9sw\" (UniqueName: \"kubernetes.io/projected/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-kube-api-access-5x9sw\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.341135 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-utilities\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.341208 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-catalog-content\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.341252 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x9sw\" (UniqueName: \"kubernetes.io/projected/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-kube-api-access-5x9sw\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.341741 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-utilities\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.341822 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-catalog-content\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.359199 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x9sw\" (UniqueName: \"kubernetes.io/projected/98f8ed4c-aff2-4925-bd20-9c87ae114c9c-kube-api-access-5x9sw\") pod \"certified-operators-8wmds\" (UID: \"98f8ed4c-aff2-4925-bd20-9c87ae114c9c\") " pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.450932 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:45 crc kubenswrapper[4948]: I1204 18:30:45.927494 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wmds"] Dec 04 18:30:46 crc kubenswrapper[4948]: I1204 18:30:46.603393 4948 generic.go:334] "Generic (PLEG): container finished" podID="98f8ed4c-aff2-4925-bd20-9c87ae114c9c" containerID="15cb1d92e03bfb769b94493d5e33002c2eb10c50aa00e6b278ea8e8b2ac00e1c" exitCode=0 Dec 04 18:30:46 crc kubenswrapper[4948]: I1204 18:30:46.603452 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wmds" event={"ID":"98f8ed4c-aff2-4925-bd20-9c87ae114c9c","Type":"ContainerDied","Data":"15cb1d92e03bfb769b94493d5e33002c2eb10c50aa00e6b278ea8e8b2ac00e1c"} Dec 04 18:30:46 crc kubenswrapper[4948]: I1204 18:30:46.603489 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wmds" event={"ID":"98f8ed4c-aff2-4925-bd20-9c87ae114c9c","Type":"ContainerStarted","Data":"d215fe9f676010295adf3d2be15776165e40c97dee725292a3496ec24b4f5e92"} Dec 04 18:30:46 crc kubenswrapper[4948]: I1204 18:30:46.605664 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 18:30:47 crc kubenswrapper[4948]: I1204 18:30:47.915161 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:30:47 crc kubenswrapper[4948]: E1204 18:30:47.915857 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 
18:30:51 crc kubenswrapper[4948]: I1204 18:30:51.652315 4948 generic.go:334] "Generic (PLEG): container finished" podID="98f8ed4c-aff2-4925-bd20-9c87ae114c9c" containerID="8819233018a1bd406ba690fd13d83e844b09ec0965c45e16edf85171b52ae83c" exitCode=0 Dec 04 18:30:51 crc kubenswrapper[4948]: I1204 18:30:51.652483 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wmds" event={"ID":"98f8ed4c-aff2-4925-bd20-9c87ae114c9c","Type":"ContainerDied","Data":"8819233018a1bd406ba690fd13d83e844b09ec0965c45e16edf85171b52ae83c"} Dec 04 18:30:52 crc kubenswrapper[4948]: I1204 18:30:52.667375 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8wmds" event={"ID":"98f8ed4c-aff2-4925-bd20-9c87ae114c9c","Type":"ContainerStarted","Data":"75f83649e7d081c6fec70ec4188121964fda281c20d5817cc3b85bf3d19b17f7"} Dec 04 18:30:52 crc kubenswrapper[4948]: I1204 18:30:52.701585 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8wmds" podStartSLOduration=2.054419078 podStartE2EDuration="7.701554588s" podCreationTimestamp="2025-12-04 18:30:45 +0000 UTC" firstStartedPulling="2025-12-04 18:30:46.605402279 +0000 UTC m=+3857.966476681" lastFinishedPulling="2025-12-04 18:30:52.252537779 +0000 UTC m=+3863.613612191" observedRunningTime="2025-12-04 18:30:52.693062848 +0000 UTC m=+3864.054137250" watchObservedRunningTime="2025-12-04 18:30:52.701554588 +0000 UTC m=+3864.062629020" Dec 04 18:30:55 crc kubenswrapper[4948]: I1204 18:30:55.452196 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:55 crc kubenswrapper[4948]: I1204 18:30:55.452289 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:30:55 crc kubenswrapper[4948]: I1204 18:30:55.511294 4948 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:31:01 crc kubenswrapper[4948]: I1204 18:31:01.914726 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:31:01 crc kubenswrapper[4948]: E1204 18:31:01.915989 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:31:05 crc kubenswrapper[4948]: I1204 18:31:05.526291 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8wmds" Dec 04 18:31:05 crc kubenswrapper[4948]: I1204 18:31:05.612132 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8wmds"] Dec 04 18:31:05 crc kubenswrapper[4948]: I1204 18:31:05.668822 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 18:31:05 crc kubenswrapper[4948]: I1204 18:31:05.669144 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vv4h6" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="registry-server" containerID="cri-o://6da28e26da57b5b687704f2a42b7f467ac3cf4393aaf72df6b7a545c282431e7" gracePeriod=2 Dec 04 18:31:06 crc kubenswrapper[4948]: I1204 18:31:06.780530 4948 generic.go:334] "Generic (PLEG): container finished" podID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerID="6da28e26da57b5b687704f2a42b7f467ac3cf4393aaf72df6b7a545c282431e7" exitCode=0 Dec 04 18:31:06 crc kubenswrapper[4948]: I1204 
18:31:06.780601 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vv4h6" event={"ID":"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8","Type":"ContainerDied","Data":"6da28e26da57b5b687704f2a42b7f467ac3cf4393aaf72df6b7a545c282431e7"} Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.202272 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.307153 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x429r\" (UniqueName: \"kubernetes.io/projected/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-kube-api-access-x429r\") pod \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.307283 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-utilities\") pod \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.307328 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-catalog-content\") pod \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\" (UID: \"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8\") " Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.308822 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-utilities" (OuterVolumeSpecName: "utilities") pod "8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" (UID: "8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.326408 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-kube-api-access-x429r" (OuterVolumeSpecName: "kube-api-access-x429r") pod "8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" (UID: "8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8"). InnerVolumeSpecName "kube-api-access-x429r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.374249 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" (UID: "8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.408546 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x429r\" (UniqueName: \"kubernetes.io/projected/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-kube-api-access-x429r\") on node \"crc\" DevicePath \"\"" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.408584 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.408593 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.789930 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vv4h6" 
event={"ID":"8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8","Type":"ContainerDied","Data":"ec5cde05c72fc4e863308c1b66d076069ba84b558a01078071eb2400afb15397"} Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.790018 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vv4h6" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.790241 4948 scope.go:117] "RemoveContainer" containerID="6da28e26da57b5b687704f2a42b7f467ac3cf4393aaf72df6b7a545c282431e7" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.812761 4948 scope.go:117] "RemoveContainer" containerID="94c6a4fa8ad8e6c07e199e78f5b35e01ea920cd56249614d724c43d28cca129b" Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.834421 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.839606 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vv4h6"] Dec 04 18:31:07 crc kubenswrapper[4948]: I1204 18:31:07.853227 4948 scope.go:117] "RemoveContainer" containerID="cf66e9bcd5504a9e28d38570957b397f95bdc1f64120bad681fe6e3dec324426" Dec 04 18:31:08 crc kubenswrapper[4948]: I1204 18:31:08.928698 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" path="/var/lib/kubelet/pods/8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8/volumes" Dec 04 18:31:14 crc kubenswrapper[4948]: I1204 18:31:14.914519 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:31:14 crc kubenswrapper[4948]: E1204 18:31:14.917189 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:31:27 crc kubenswrapper[4948]: I1204 18:31:27.914286 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:31:27 crc kubenswrapper[4948]: E1204 18:31:27.915145 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:31:40 crc kubenswrapper[4948]: I1204 18:31:40.914638 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:31:40 crc kubenswrapper[4948]: E1204 18:31:40.915549 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:31:54 crc kubenswrapper[4948]: I1204 18:31:54.914685 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:31:54 crc kubenswrapper[4948]: E1204 18:31:54.915961 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:32:05 crc kubenswrapper[4948]: I1204 18:32:05.913926 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:32:05 crc kubenswrapper[4948]: E1204 18:32:05.915087 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:32:19 crc kubenswrapper[4948]: I1204 18:32:19.914238 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:32:20 crc kubenswrapper[4948]: I1204 18:32:20.360319 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"21d4f263ca2876fbc4b3ebad1535ac86933161b38d527e886bfb4986a8950bf3"} Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.848594 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zqkrb"] Dec 04 18:32:42 crc kubenswrapper[4948]: E1204 18:32:42.849427 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="extract-content" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.849444 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" 
containerName="extract-content" Dec 04 18:32:42 crc kubenswrapper[4948]: E1204 18:32:42.849482 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="registry-server" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.849488 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="registry-server" Dec 04 18:32:42 crc kubenswrapper[4948]: E1204 18:32:42.849498 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="extract-utilities" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.849504 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="extract-utilities" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.849654 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad8ca5b-ccf2-4983-b75a-db81ed78f0f8" containerName="registry-server" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.850629 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.866938 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqkrb"] Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.949686 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q6jf\" (UniqueName: \"kubernetes.io/projected/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-kube-api-access-7q6jf\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.949753 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-catalog-content\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:42 crc kubenswrapper[4948]: I1204 18:32:42.949937 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-utilities\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.050829 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q6jf\" (UniqueName: \"kubernetes.io/projected/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-kube-api-access-7q6jf\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.050887 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-catalog-content\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.050927 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-utilities\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.051460 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-utilities\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.051596 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-catalog-content\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.085836 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q6jf\" (UniqueName: \"kubernetes.io/projected/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-kube-api-access-7q6jf\") pod \"redhat-marketplace-zqkrb\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.212268 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:43 crc kubenswrapper[4948]: I1204 18:32:43.738192 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqkrb"] Dec 04 18:32:44 crc kubenswrapper[4948]: I1204 18:32:44.574305 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerStarted","Data":"24c0272c7985c52f72f0c53b1debb25775c676fe814bd6f1e1e17efbb753739b"} Dec 04 18:32:44 crc kubenswrapper[4948]: I1204 18:32:44.574350 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerStarted","Data":"ff98edab254aac235ed428c348994bec2e1f59ad29ad5671a9702845a9189e6e"} Dec 04 18:32:45 crc kubenswrapper[4948]: I1204 18:32:45.582327 4948 generic.go:334] "Generic (PLEG): container finished" podID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerID="24c0272c7985c52f72f0c53b1debb25775c676fe814bd6f1e1e17efbb753739b" exitCode=0 Dec 04 18:32:45 crc kubenswrapper[4948]: I1204 18:32:45.582447 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerDied","Data":"24c0272c7985c52f72f0c53b1debb25775c676fe814bd6f1e1e17efbb753739b"} Dec 04 18:32:46 crc kubenswrapper[4948]: I1204 18:32:46.593541 4948 generic.go:334] "Generic (PLEG): container finished" podID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerID="b6bbadc856f1121809e95865436b86ee44fc7d6bef0fef4b3689742d8701dfe6" exitCode=0 Dec 04 18:32:46 crc kubenswrapper[4948]: I1204 18:32:46.593594 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" 
event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerDied","Data":"b6bbadc856f1121809e95865436b86ee44fc7d6bef0fef4b3689742d8701dfe6"} Dec 04 18:32:47 crc kubenswrapper[4948]: I1204 18:32:47.606744 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerStarted","Data":"178dc73ee1a3656541587496666386179fb62af8fa3ce8af802d3e1bdd9e5bb0"} Dec 04 18:32:47 crc kubenswrapper[4948]: I1204 18:32:47.638537 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zqkrb" podStartSLOduration=4.260296213 podStartE2EDuration="5.638509827s" podCreationTimestamp="2025-12-04 18:32:42 +0000 UTC" firstStartedPulling="2025-12-04 18:32:45.584005371 +0000 UTC m=+3976.945079773" lastFinishedPulling="2025-12-04 18:32:46.962218975 +0000 UTC m=+3978.323293387" observedRunningTime="2025-12-04 18:32:47.628426611 +0000 UTC m=+3978.989501083" watchObservedRunningTime="2025-12-04 18:32:47.638509827 +0000 UTC m=+3978.999584259" Dec 04 18:32:53 crc kubenswrapper[4948]: I1204 18:32:53.212923 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:53 crc kubenswrapper[4948]: I1204 18:32:53.213450 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:53 crc kubenswrapper[4948]: I1204 18:32:53.272947 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:32:53 crc kubenswrapper[4948]: I1204 18:32:53.715424 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:33:02 crc kubenswrapper[4948]: I1204 18:33:02.487649 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-zqkrb"] Dec 04 18:33:02 crc kubenswrapper[4948]: I1204 18:33:02.489495 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zqkrb" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="registry-server" containerID="cri-o://178dc73ee1a3656541587496666386179fb62af8fa3ce8af802d3e1bdd9e5bb0" gracePeriod=2 Dec 04 18:33:02 crc kubenswrapper[4948]: I1204 18:33:02.746241 4948 generic.go:334] "Generic (PLEG): container finished" podID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerID="178dc73ee1a3656541587496666386179fb62af8fa3ce8af802d3e1bdd9e5bb0" exitCode=0 Dec 04 18:33:02 crc kubenswrapper[4948]: I1204 18:33:02.746295 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerDied","Data":"178dc73ee1a3656541587496666386179fb62af8fa3ce8af802d3e1bdd9e5bb0"} Dec 04 18:33:02 crc kubenswrapper[4948]: I1204 18:33:02.928856 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.085804 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q6jf\" (UniqueName: \"kubernetes.io/projected/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-kube-api-access-7q6jf\") pod \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.085943 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-catalog-content\") pod \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.086140 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-utilities\") pod \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\" (UID: \"43a0c848-c9b2-4fd4-ab92-2077f377aeb0\") " Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.087635 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-utilities" (OuterVolumeSpecName: "utilities") pod "43a0c848-c9b2-4fd4-ab92-2077f377aeb0" (UID: "43a0c848-c9b2-4fd4-ab92-2077f377aeb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.092108 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-kube-api-access-7q6jf" (OuterVolumeSpecName: "kube-api-access-7q6jf") pod "43a0c848-c9b2-4fd4-ab92-2077f377aeb0" (UID: "43a0c848-c9b2-4fd4-ab92-2077f377aeb0"). InnerVolumeSpecName "kube-api-access-7q6jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.108617 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43a0c848-c9b2-4fd4-ab92-2077f377aeb0" (UID: "43a0c848-c9b2-4fd4-ab92-2077f377aeb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.187953 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.187999 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q6jf\" (UniqueName: \"kubernetes.io/projected/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-kube-api-access-7q6jf\") on node \"crc\" DevicePath \"\"" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.188015 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43a0c848-c9b2-4fd4-ab92-2077f377aeb0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.761413 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zqkrb" event={"ID":"43a0c848-c9b2-4fd4-ab92-2077f377aeb0","Type":"ContainerDied","Data":"ff98edab254aac235ed428c348994bec2e1f59ad29ad5671a9702845a9189e6e"} Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.761503 4948 scope.go:117] "RemoveContainer" containerID="178dc73ee1a3656541587496666386179fb62af8fa3ce8af802d3e1bdd9e5bb0" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.761529 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zqkrb" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.801412 4948 scope.go:117] "RemoveContainer" containerID="b6bbadc856f1121809e95865436b86ee44fc7d6bef0fef4b3689742d8701dfe6" Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.825552 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqkrb"] Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.836740 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zqkrb"] Dec 04 18:33:03 crc kubenswrapper[4948]: I1204 18:33:03.844639 4948 scope.go:117] "RemoveContainer" containerID="24c0272c7985c52f72f0c53b1debb25775c676fe814bd6f1e1e17efbb753739b" Dec 04 18:33:04 crc kubenswrapper[4948]: I1204 18:33:04.928089 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" path="/var/lib/kubelet/pods/43a0c848-c9b2-4fd4-ab92-2077f377aeb0/volumes" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.511614 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9z27p"] Dec 04 18:34:39 crc kubenswrapper[4948]: E1204 18:34:39.512917 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="extract-utilities" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.512950 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="extract-utilities" Dec 04 18:34:39 crc kubenswrapper[4948]: E1204 18:34:39.513000 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="registry-server" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.513020 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="registry-server" Dec 
04 18:34:39 crc kubenswrapper[4948]: E1204 18:34:39.513096 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="extract-content" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.513115 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="extract-content" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.513470 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a0c848-c9b2-4fd4-ab92-2077f377aeb0" containerName="registry-server" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.515812 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.526455 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9z27p"] Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.594515 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-utilities\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.595491 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m99p7\" (UniqueName: \"kubernetes.io/projected/318cff0e-0061-4568-9d15-1d4d13f9b1ce-kube-api-access-m99p7\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.595554 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-catalog-content\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.696813 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-catalog-content\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.696892 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-utilities\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.696977 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m99p7\" (UniqueName: \"kubernetes.io/projected/318cff0e-0061-4568-9d15-1d4d13f9b1ce-kube-api-access-m99p7\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.697788 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-catalog-content\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.697889 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-utilities\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.719270 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m99p7\" (UniqueName: \"kubernetes.io/projected/318cff0e-0061-4568-9d15-1d4d13f9b1ce-kube-api-access-m99p7\") pod \"redhat-operators-9z27p\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:39 crc kubenswrapper[4948]: I1204 18:34:39.893949 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:40 crc kubenswrapper[4948]: I1204 18:34:40.316488 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9z27p"] Dec 04 18:34:40 crc kubenswrapper[4948]: W1204 18:34:40.321758 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod318cff0e_0061_4568_9d15_1d4d13f9b1ce.slice/crio-5e309307ae5ad373743810bc1d9b18b7b4f0c71f5d2e0de848a3d7b2bcd3c290 WatchSource:0}: Error finding container 5e309307ae5ad373743810bc1d9b18b7b4f0c71f5d2e0de848a3d7b2bcd3c290: Status 404 returned error can't find the container with id 5e309307ae5ad373743810bc1d9b18b7b4f0c71f5d2e0de848a3d7b2bcd3c290 Dec 04 18:34:40 crc kubenswrapper[4948]: I1204 18:34:40.637929 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:34:40 crc kubenswrapper[4948]: I1204 18:34:40.638000 4948 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:34:40 crc kubenswrapper[4948]: I1204 18:34:40.644836 4948 generic.go:334] "Generic (PLEG): container finished" podID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerID="a153345390fe40d11f81c437403dc772425ee7af0124237bd62127e5007c2fde" exitCode=0 Dec 04 18:34:40 crc kubenswrapper[4948]: I1204 18:34:40.644887 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerDied","Data":"a153345390fe40d11f81c437403dc772425ee7af0124237bd62127e5007c2fde"} Dec 04 18:34:40 crc kubenswrapper[4948]: I1204 18:34:40.644941 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerStarted","Data":"5e309307ae5ad373743810bc1d9b18b7b4f0c71f5d2e0de848a3d7b2bcd3c290"} Dec 04 18:34:41 crc kubenswrapper[4948]: I1204 18:34:41.653826 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerStarted","Data":"8cd01e9d989bae10da341f7b5ac3bbb25577b9a4681c196d53951c220efe6c81"} Dec 04 18:34:42 crc kubenswrapper[4948]: I1204 18:34:42.662793 4948 generic.go:334] "Generic (PLEG): container finished" podID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerID="8cd01e9d989bae10da341f7b5ac3bbb25577b9a4681c196d53951c220efe6c81" exitCode=0 Dec 04 18:34:42 crc kubenswrapper[4948]: I1204 18:34:42.662909 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" 
event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerDied","Data":"8cd01e9d989bae10da341f7b5ac3bbb25577b9a4681c196d53951c220efe6c81"} Dec 04 18:34:43 crc kubenswrapper[4948]: I1204 18:34:43.673843 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerStarted","Data":"3f11eb14de08c6ad91872692c5ce4f4cc84a04756acdd769452fbf57b84a6816"} Dec 04 18:34:43 crc kubenswrapper[4948]: I1204 18:34:43.701484 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9z27p" podStartSLOduration=2.229623095 podStartE2EDuration="4.701461268s" podCreationTimestamp="2025-12-04 18:34:39 +0000 UTC" firstStartedPulling="2025-12-04 18:34:40.646244864 +0000 UTC m=+4092.007319266" lastFinishedPulling="2025-12-04 18:34:43.118083017 +0000 UTC m=+4094.479157439" observedRunningTime="2025-12-04 18:34:43.700679956 +0000 UTC m=+4095.061754368" watchObservedRunningTime="2025-12-04 18:34:43.701461268 +0000 UTC m=+4095.062535680" Dec 04 18:34:49 crc kubenswrapper[4948]: I1204 18:34:49.895089 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:49 crc kubenswrapper[4948]: I1204 18:34:49.895737 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:49 crc kubenswrapper[4948]: I1204 18:34:49.980915 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:50 crc kubenswrapper[4948]: I1204 18:34:50.851233 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:50 crc kubenswrapper[4948]: I1204 18:34:50.937873 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-9z27p"] Dec 04 18:34:52 crc kubenswrapper[4948]: I1204 18:34:52.756019 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9z27p" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="registry-server" containerID="cri-o://3f11eb14de08c6ad91872692c5ce4f4cc84a04756acdd769452fbf57b84a6816" gracePeriod=2 Dec 04 18:34:54 crc kubenswrapper[4948]: I1204 18:34:54.778863 4948 generic.go:334] "Generic (PLEG): container finished" podID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerID="3f11eb14de08c6ad91872692c5ce4f4cc84a04756acdd769452fbf57b84a6816" exitCode=0 Dec 04 18:34:54 crc kubenswrapper[4948]: I1204 18:34:54.779074 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerDied","Data":"3f11eb14de08c6ad91872692c5ce4f4cc84a04756acdd769452fbf57b84a6816"} Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.058092 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.222315 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-utilities\") pod \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.222372 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m99p7\" (UniqueName: \"kubernetes.io/projected/318cff0e-0061-4568-9d15-1d4d13f9b1ce-kube-api-access-m99p7\") pod \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.222402 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-catalog-content\") pod \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\" (UID: \"318cff0e-0061-4568-9d15-1d4d13f9b1ce\") " Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.223669 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-utilities" (OuterVolumeSpecName: "utilities") pod "318cff0e-0061-4568-9d15-1d4d13f9b1ce" (UID: "318cff0e-0061-4568-9d15-1d4d13f9b1ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.229650 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/318cff0e-0061-4568-9d15-1d4d13f9b1ce-kube-api-access-m99p7" (OuterVolumeSpecName: "kube-api-access-m99p7") pod "318cff0e-0061-4568-9d15-1d4d13f9b1ce" (UID: "318cff0e-0061-4568-9d15-1d4d13f9b1ce"). InnerVolumeSpecName "kube-api-access-m99p7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.324732 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.324808 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m99p7\" (UniqueName: \"kubernetes.io/projected/318cff0e-0061-4568-9d15-1d4d13f9b1ce-kube-api-access-m99p7\") on node \"crc\" DevicePath \"\"" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.376188 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "318cff0e-0061-4568-9d15-1d4d13f9b1ce" (UID: "318cff0e-0061-4568-9d15-1d4d13f9b1ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.426173 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/318cff0e-0061-4568-9d15-1d4d13f9b1ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.792524 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z27p" event={"ID":"318cff0e-0061-4568-9d15-1d4d13f9b1ce","Type":"ContainerDied","Data":"5e309307ae5ad373743810bc1d9b18b7b4f0c71f5d2e0de848a3d7b2bcd3c290"} Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.792612 4948 scope.go:117] "RemoveContainer" containerID="3f11eb14de08c6ad91872692c5ce4f4cc84a04756acdd769452fbf57b84a6816" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.792665 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9z27p" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.818507 4948 scope.go:117] "RemoveContainer" containerID="8cd01e9d989bae10da341f7b5ac3bbb25577b9a4681c196d53951c220efe6c81" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.842223 4948 scope.go:117] "RemoveContainer" containerID="a153345390fe40d11f81c437403dc772425ee7af0124237bd62127e5007c2fde" Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.854377 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9z27p"] Dec 04 18:34:55 crc kubenswrapper[4948]: I1204 18:34:55.866650 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9z27p"] Dec 04 18:34:56 crc kubenswrapper[4948]: I1204 18:34:56.930953 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" path="/var/lib/kubelet/pods/318cff0e-0061-4568-9d15-1d4d13f9b1ce/volumes" Dec 04 18:35:10 crc kubenswrapper[4948]: I1204 18:35:10.625029 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:35:10 crc kubenswrapper[4948]: I1204 18:35:10.625668 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:35:40 crc kubenswrapper[4948]: I1204 18:35:40.625639 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:35:40 crc kubenswrapper[4948]: I1204 18:35:40.626468 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:35:40 crc kubenswrapper[4948]: I1204 18:35:40.626554 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:35:40 crc kubenswrapper[4948]: I1204 18:35:40.627632 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21d4f263ca2876fbc4b3ebad1535ac86933161b38d527e886bfb4986a8950bf3"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:35:40 crc kubenswrapper[4948]: I1204 18:35:40.627783 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://21d4f263ca2876fbc4b3ebad1535ac86933161b38d527e886bfb4986a8950bf3" gracePeriod=600 Dec 04 18:35:41 crc kubenswrapper[4948]: I1204 18:35:41.193559 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="21d4f263ca2876fbc4b3ebad1535ac86933161b38d527e886bfb4986a8950bf3" exitCode=0 Dec 04 18:35:41 crc kubenswrapper[4948]: I1204 18:35:41.193899 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" 
event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"21d4f263ca2876fbc4b3ebad1535ac86933161b38d527e886bfb4986a8950bf3"} Dec 04 18:35:41 crc kubenswrapper[4948]: I1204 18:35:41.193937 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6"} Dec 04 18:35:41 crc kubenswrapper[4948]: I1204 18:35:41.193962 4948 scope.go:117] "RemoveContainer" containerID="a3534725d5bda041e4c1e4590fbbf0d4339a184a10712917dab7a40da27e2115" Dec 04 18:37:40 crc kubenswrapper[4948]: I1204 18:37:40.625753 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:37:40 crc kubenswrapper[4948]: I1204 18:37:40.626631 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:38:10 crc kubenswrapper[4948]: I1204 18:38:10.624581 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:38:10 crc kubenswrapper[4948]: I1204 18:38:10.626084 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:38:40 crc kubenswrapper[4948]: I1204 18:38:40.625374 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:38:40 crc kubenswrapper[4948]: I1204 18:38:40.626024 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:38:40 crc kubenswrapper[4948]: I1204 18:38:40.626147 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:38:40 crc kubenswrapper[4948]: I1204 18:38:40.626909 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:38:40 crc kubenswrapper[4948]: I1204 18:38:40.626991 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" gracePeriod=600 Dec 04 18:38:40 crc kubenswrapper[4948]: E1204 
18:38:40.762682 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:38:41 crc kubenswrapper[4948]: I1204 18:38:41.315859 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" exitCode=0 Dec 04 18:38:41 crc kubenswrapper[4948]: I1204 18:38:41.315932 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6"} Dec 04 18:38:41 crc kubenswrapper[4948]: I1204 18:38:41.316101 4948 scope.go:117] "RemoveContainer" containerID="21d4f263ca2876fbc4b3ebad1535ac86933161b38d527e886bfb4986a8950bf3" Dec 04 18:38:41 crc kubenswrapper[4948]: I1204 18:38:41.316813 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:38:41 crc kubenswrapper[4948]: E1204 18:38:41.317140 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:38:52 crc kubenswrapper[4948]: I1204 18:38:52.913581 4948 scope.go:117] "RemoveContainer" 
containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:38:52 crc kubenswrapper[4948]: E1204 18:38:52.914726 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:39:03 crc kubenswrapper[4948]: I1204 18:39:03.914081 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:39:03 crc kubenswrapper[4948]: E1204 18:39:03.914918 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:39:15 crc kubenswrapper[4948]: I1204 18:39:15.914193 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:39:15 crc kubenswrapper[4948]: E1204 18:39:15.915469 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:39:29 crc kubenswrapper[4948]: I1204 18:39:29.914540 4948 scope.go:117] 
"RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:39:29 crc kubenswrapper[4948]: E1204 18:39:29.915803 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:39:43 crc kubenswrapper[4948]: I1204 18:39:43.914299 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:39:43 crc kubenswrapper[4948]: E1204 18:39:43.915098 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:39:58 crc kubenswrapper[4948]: I1204 18:39:58.918561 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:39:58 crc kubenswrapper[4948]: E1204 18:39:58.919422 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:40:11 crc kubenswrapper[4948]: I1204 18:40:11.915248 
4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:40:11 crc kubenswrapper[4948]: E1204 18:40:11.916723 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:40:23 crc kubenswrapper[4948]: I1204 18:40:23.915175 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:40:23 crc kubenswrapper[4948]: E1204 18:40:23.915897 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:40:38 crc kubenswrapper[4948]: I1204 18:40:38.917180 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:40:38 crc kubenswrapper[4948]: E1204 18:40:38.917823 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:40:49 crc kubenswrapper[4948]: I1204 
18:40:49.914249 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:40:49 crc kubenswrapper[4948]: E1204 18:40:49.915216 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:41:04 crc kubenswrapper[4948]: I1204 18:41:04.913434 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:41:04 crc kubenswrapper[4948]: E1204 18:41:04.914152 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:41:16 crc kubenswrapper[4948]: I1204 18:41:16.914341 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:41:16 crc kubenswrapper[4948]: E1204 18:41:16.915764 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:41:28 crc 
kubenswrapper[4948]: I1204 18:41:28.920717 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:41:28 crc kubenswrapper[4948]: E1204 18:41:28.923299 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:41:42 crc kubenswrapper[4948]: I1204 18:41:42.914903 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:41:42 crc kubenswrapper[4948]: E1204 18:41:42.916034 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:41:57 crc kubenswrapper[4948]: I1204 18:41:57.914931 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:41:57 crc kubenswrapper[4948]: E1204 18:41:57.916164 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 
04 18:42:08 crc kubenswrapper[4948]: I1204 18:42:08.922460 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:42:08 crc kubenswrapper[4948]: E1204 18:42:08.931635 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.174129 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szx4s"] Dec 04 18:42:17 crc kubenswrapper[4948]: E1204 18:42:17.175339 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="extract-utilities" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.175364 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="extract-utilities" Dec 04 18:42:17 crc kubenswrapper[4948]: E1204 18:42:17.175379 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="registry-server" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.175392 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="registry-server" Dec 04 18:42:17 crc kubenswrapper[4948]: E1204 18:42:17.175453 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="extract-content" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.175468 4948 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="extract-content" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.175739 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="318cff0e-0061-4568-9d15-1d4d13f9b1ce" containerName="registry-server" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.177844 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.187471 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szx4s"] Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.363278 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-utilities\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.363671 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-catalog-content\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.363824 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xf6\" (UniqueName: \"kubernetes.io/projected/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-kube-api-access-h5xf6\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.465552 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h5xf6\" (UniqueName: \"kubernetes.io/projected/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-kube-api-access-h5xf6\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.465859 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-utilities\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.465992 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-catalog-content\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.466415 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-catalog-content\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.466531 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-utilities\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.490599 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h5xf6\" (UniqueName: \"kubernetes.io/projected/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-kube-api-access-h5xf6\") pod \"community-operators-szx4s\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.515977 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:17 crc kubenswrapper[4948]: I1204 18:42:17.993713 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szx4s"] Dec 04 18:42:17 crc kubenswrapper[4948]: W1204 18:42:17.997448 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65c1ef6_6242_4d4d_8a72_6ecbeed7d6f1.slice/crio-3d2e6363c7576a55d57736a5488ef561c58e2ecd3dadc17543426b0fbb103467 WatchSource:0}: Error finding container 3d2e6363c7576a55d57736a5488ef561c58e2ecd3dadc17543426b0fbb103467: Status 404 returned error can't find the container with id 3d2e6363c7576a55d57736a5488ef561c58e2ecd3dadc17543426b0fbb103467 Dec 04 18:42:18 crc kubenswrapper[4948]: I1204 18:42:18.286614 4948 generic.go:334] "Generic (PLEG): container finished" podID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerID="e34c992399f02c489949569528809a43724f896be420afb309772ae26f8cab3b" exitCode=0 Dec 04 18:42:18 crc kubenswrapper[4948]: I1204 18:42:18.286695 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4s" event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerDied","Data":"e34c992399f02c489949569528809a43724f896be420afb309772ae26f8cab3b"} Dec 04 18:42:18 crc kubenswrapper[4948]: I1204 18:42:18.286747 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4s" 
event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerStarted","Data":"3d2e6363c7576a55d57736a5488ef561c58e2ecd3dadc17543426b0fbb103467"} Dec 04 18:42:18 crc kubenswrapper[4948]: I1204 18:42:18.289637 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 18:42:19 crc kubenswrapper[4948]: I1204 18:42:19.293886 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4s" event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerStarted","Data":"6781f2422c3975a8b4348f97ec4f6a52f530df7945a1348185da22de25f0a2cf"} Dec 04 18:42:20 crc kubenswrapper[4948]: I1204 18:42:20.307757 4948 generic.go:334] "Generic (PLEG): container finished" podID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerID="6781f2422c3975a8b4348f97ec4f6a52f530df7945a1348185da22de25f0a2cf" exitCode=0 Dec 04 18:42:20 crc kubenswrapper[4948]: I1204 18:42:20.307836 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4s" event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerDied","Data":"6781f2422c3975a8b4348f97ec4f6a52f530df7945a1348185da22de25f0a2cf"} Dec 04 18:42:20 crc kubenswrapper[4948]: I1204 18:42:20.913990 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:42:20 crc kubenswrapper[4948]: E1204 18:42:20.914545 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:42:21 crc kubenswrapper[4948]: I1204 18:42:21.322796 4948 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-szx4s" event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerStarted","Data":"76aa78c4ea63dca1c1983bf79bbbd481ccb595caaef9c7800797c5f92d42f29a"} Dec 04 18:42:21 crc kubenswrapper[4948]: I1204 18:42:21.355504 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szx4s" podStartSLOduration=1.92550174 podStartE2EDuration="4.355474972s" podCreationTimestamp="2025-12-04 18:42:17 +0000 UTC" firstStartedPulling="2025-12-04 18:42:18.289240142 +0000 UTC m=+4549.650314574" lastFinishedPulling="2025-12-04 18:42:20.719213374 +0000 UTC m=+4552.080287806" observedRunningTime="2025-12-04 18:42:21.352356753 +0000 UTC m=+4552.713431225" watchObservedRunningTime="2025-12-04 18:42:21.355474972 +0000 UTC m=+4552.716549424" Dec 04 18:42:27 crc kubenswrapper[4948]: I1204 18:42:27.516632 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:27 crc kubenswrapper[4948]: I1204 18:42:27.517761 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:27 crc kubenswrapper[4948]: I1204 18:42:27.594075 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:28 crc kubenswrapper[4948]: I1204 18:42:28.461729 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:29 crc kubenswrapper[4948]: I1204 18:42:29.247529 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szx4s"] Dec 04 18:42:30 crc kubenswrapper[4948]: I1204 18:42:30.449930 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szx4s" 
podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="registry-server" containerID="cri-o://76aa78c4ea63dca1c1983bf79bbbd481ccb595caaef9c7800797c5f92d42f29a" gracePeriod=2 Dec 04 18:42:31 crc kubenswrapper[4948]: I1204 18:42:31.466523 4948 generic.go:334] "Generic (PLEG): container finished" podID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerID="76aa78c4ea63dca1c1983bf79bbbd481ccb595caaef9c7800797c5f92d42f29a" exitCode=0 Dec 04 18:42:31 crc kubenswrapper[4948]: I1204 18:42:31.466631 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4s" event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerDied","Data":"76aa78c4ea63dca1c1983bf79bbbd481ccb595caaef9c7800797c5f92d42f29a"} Dec 04 18:42:31 crc kubenswrapper[4948]: I1204 18:42:31.949425 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.132390 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-utilities\") pod \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.132456 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-catalog-content\") pod \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\" (UID: \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.132486 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xf6\" (UniqueName: \"kubernetes.io/projected/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-kube-api-access-h5xf6\") pod \"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\" (UID: 
\"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1\") " Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.133791 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-utilities" (OuterVolumeSpecName: "utilities") pod "f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" (UID: "f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.153185 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-kube-api-access-h5xf6" (OuterVolumeSpecName: "kube-api-access-h5xf6") pod "f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" (UID: "f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1"). InnerVolumeSpecName "kube-api-access-h5xf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.198670 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" (UID: "f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.234500 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.234839 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.234979 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5xf6\" (UniqueName: \"kubernetes.io/projected/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1-kube-api-access-h5xf6\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.482154 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szx4s" event={"ID":"f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1","Type":"ContainerDied","Data":"3d2e6363c7576a55d57736a5488ef561c58e2ecd3dadc17543426b0fbb103467"} Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.482224 4948 scope.go:117] "RemoveContainer" containerID="76aa78c4ea63dca1c1983bf79bbbd481ccb595caaef9c7800797c5f92d42f29a" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.483404 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-szx4s" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.521716 4948 scope.go:117] "RemoveContainer" containerID="6781f2422c3975a8b4348f97ec4f6a52f530df7945a1348185da22de25f0a2cf" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.541974 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szx4s"] Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.550998 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szx4s"] Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.567699 4948 scope.go:117] "RemoveContainer" containerID="e34c992399f02c489949569528809a43724f896be420afb309772ae26f8cab3b" Dec 04 18:42:32 crc kubenswrapper[4948]: I1204 18:42:32.926422 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" path="/var/lib/kubelet/pods/f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1/volumes" Dec 04 18:42:35 crc kubenswrapper[4948]: I1204 18:42:35.913896 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:42:35 crc kubenswrapper[4948]: E1204 18:42:35.914566 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.240524 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-drddw"] Dec 04 18:42:42 crc kubenswrapper[4948]: E1204 18:42:42.241548 4948 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="registry-server" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.241580 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="registry-server" Dec 04 18:42:42 crc kubenswrapper[4948]: E1204 18:42:42.241620 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="extract-utilities" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.241640 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="extract-utilities" Dec 04 18:42:42 crc kubenswrapper[4948]: E1204 18:42:42.241673 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="extract-content" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.241692 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="extract-content" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.242032 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65c1ef6-6242-4d4d-8a72-6ecbeed7d6f1" containerName="registry-server" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.244612 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.263598 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drddw"] Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.290619 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-catalog-content\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.290874 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrkr\" (UniqueName: \"kubernetes.io/projected/7220bc19-dc91-45bd-9ea8-8726eff281bb-kube-api-access-gkrkr\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.290977 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-utilities\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.391624 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-catalog-content\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.391738 4948 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gkrkr\" (UniqueName: \"kubernetes.io/projected/7220bc19-dc91-45bd-9ea8-8726eff281bb-kube-api-access-gkrkr\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.391767 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-utilities\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.392347 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-utilities\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.392359 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-catalog-content\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.416376 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkrkr\" (UniqueName: \"kubernetes.io/projected/7220bc19-dc91-45bd-9ea8-8726eff281bb-kube-api-access-gkrkr\") pod \"certified-operators-drddw\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.447325 4948 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tbwfn"] Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.449387 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.456572 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbwfn"] Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.583802 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.594348 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpdv\" (UniqueName: \"kubernetes.io/projected/2aef5269-857b-422e-9192-5b0b2852b1a8-kube-api-access-fwpdv\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.594391 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-utilities\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.594569 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-catalog-content\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.696149 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fwpdv\" (UniqueName: \"kubernetes.io/projected/2aef5269-857b-422e-9192-5b0b2852b1a8-kube-api-access-fwpdv\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.696198 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-utilities\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.696268 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-catalog-content\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.696811 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-catalog-content\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.697166 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-utilities\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.733981 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpdv\" (UniqueName: 
\"kubernetes.io/projected/2aef5269-857b-422e-9192-5b0b2852b1a8-kube-api-access-fwpdv\") pod \"redhat-marketplace-tbwfn\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:42 crc kubenswrapper[4948]: I1204 18:42:42.776501 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.060548 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-drddw"] Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.226880 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbwfn"] Dec 04 18:42:43 crc kubenswrapper[4948]: W1204 18:42:43.228870 4948 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aef5269_857b_422e_9192_5b0b2852b1a8.slice/crio-9ee0175018d6b65524c3db913a54ff54b86498d38fbc8530c9fe3baa34cea3c5 WatchSource:0}: Error finding container 9ee0175018d6b65524c3db913a54ff54b86498d38fbc8530c9fe3baa34cea3c5: Status 404 returned error can't find the container with id 9ee0175018d6b65524c3db913a54ff54b86498d38fbc8530c9fe3baa34cea3c5 Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.594610 4948 generic.go:334] "Generic (PLEG): container finished" podID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerID="e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c" exitCode=0 Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.594690 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drddw" event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerDied","Data":"e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c"} Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.594973 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-drddw" event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerStarted","Data":"e07f01c23953895e6ea192b8339bf500f15dcc5f71810b54a193fa233bbadaad"} Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.598722 4948 generic.go:334] "Generic (PLEG): container finished" podID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerID="bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02" exitCode=0 Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.598790 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbwfn" event={"ID":"2aef5269-857b-422e-9192-5b0b2852b1a8","Type":"ContainerDied","Data":"bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02"} Dec 04 18:42:43 crc kubenswrapper[4948]: I1204 18:42:43.598818 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbwfn" event={"ID":"2aef5269-857b-422e-9192-5b0b2852b1a8","Type":"ContainerStarted","Data":"9ee0175018d6b65524c3db913a54ff54b86498d38fbc8530c9fe3baa34cea3c5"} Dec 04 18:42:44 crc kubenswrapper[4948]: I1204 18:42:44.620757 4948 generic.go:334] "Generic (PLEG): container finished" podID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerID="47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9" exitCode=0 Dec 04 18:42:44 crc kubenswrapper[4948]: I1204 18:42:44.620945 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbwfn" event={"ID":"2aef5269-857b-422e-9192-5b0b2852b1a8","Type":"ContainerDied","Data":"47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9"} Dec 04 18:42:44 crc kubenswrapper[4948]: I1204 18:42:44.625937 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drddw" 
event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerStarted","Data":"b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823"} Dec 04 18:42:45 crc kubenswrapper[4948]: I1204 18:42:45.633096 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbwfn" event={"ID":"2aef5269-857b-422e-9192-5b0b2852b1a8","Type":"ContainerStarted","Data":"4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33"} Dec 04 18:42:45 crc kubenswrapper[4948]: I1204 18:42:45.635035 4948 generic.go:334] "Generic (PLEG): container finished" podID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerID="b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823" exitCode=0 Dec 04 18:42:45 crc kubenswrapper[4948]: I1204 18:42:45.635098 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drddw" event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerDied","Data":"b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823"} Dec 04 18:42:45 crc kubenswrapper[4948]: I1204 18:42:45.658496 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tbwfn" podStartSLOduration=2.220022014 podStartE2EDuration="3.658469243s" podCreationTimestamp="2025-12-04 18:42:42 +0000 UTC" firstStartedPulling="2025-12-04 18:42:43.600737478 +0000 UTC m=+4574.961811890" lastFinishedPulling="2025-12-04 18:42:45.039184717 +0000 UTC m=+4576.400259119" observedRunningTime="2025-12-04 18:42:45.648814058 +0000 UTC m=+4577.009888470" watchObservedRunningTime="2025-12-04 18:42:45.658469243 +0000 UTC m=+4577.019543655" Dec 04 18:42:46 crc kubenswrapper[4948]: I1204 18:42:46.644810 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drddw" 
event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerStarted","Data":"868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198"} Dec 04 18:42:46 crc kubenswrapper[4948]: I1204 18:42:46.670259 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-drddw" podStartSLOduration=2.261165062 podStartE2EDuration="4.67024194s" podCreationTimestamp="2025-12-04 18:42:42 +0000 UTC" firstStartedPulling="2025-12-04 18:42:43.596426175 +0000 UTC m=+4574.957500587" lastFinishedPulling="2025-12-04 18:42:46.005503023 +0000 UTC m=+4577.366577465" observedRunningTime="2025-12-04 18:42:46.669295463 +0000 UTC m=+4578.030369875" watchObservedRunningTime="2025-12-04 18:42:46.67024194 +0000 UTC m=+4578.031316352" Dec 04 18:42:46 crc kubenswrapper[4948]: I1204 18:42:46.914569 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:42:46 crc kubenswrapper[4948]: E1204 18:42:46.914792 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.584367 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.585075 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.629797 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.733562 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.776979 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.777034 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:52 crc kubenswrapper[4948]: I1204 18:42:52.816177 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:53 crc kubenswrapper[4948]: I1204 18:42:53.752715 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:53 crc kubenswrapper[4948]: I1204 18:42:53.833073 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-drddw"] Dec 04 18:42:54 crc kubenswrapper[4948]: I1204 18:42:54.704475 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-drddw" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="registry-server" containerID="cri-o://868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198" gracePeriod=2 Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.065673 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.094394 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-utilities\") pod \"7220bc19-dc91-45bd-9ea8-8726eff281bb\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.094523 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-catalog-content\") pod \"7220bc19-dc91-45bd-9ea8-8726eff281bb\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.094575 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkrkr\" (UniqueName: \"kubernetes.io/projected/7220bc19-dc91-45bd-9ea8-8726eff281bb-kube-api-access-gkrkr\") pod \"7220bc19-dc91-45bd-9ea8-8726eff281bb\" (UID: \"7220bc19-dc91-45bd-9ea8-8726eff281bb\") " Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.095563 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-utilities" (OuterVolumeSpecName: "utilities") pod "7220bc19-dc91-45bd-9ea8-8726eff281bb" (UID: "7220bc19-dc91-45bd-9ea8-8726eff281bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.101634 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7220bc19-dc91-45bd-9ea8-8726eff281bb-kube-api-access-gkrkr" (OuterVolumeSpecName: "kube-api-access-gkrkr") pod "7220bc19-dc91-45bd-9ea8-8726eff281bb" (UID: "7220bc19-dc91-45bd-9ea8-8726eff281bb"). InnerVolumeSpecName "kube-api-access-gkrkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.196802 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkrkr\" (UniqueName: \"kubernetes.io/projected/7220bc19-dc91-45bd-9ea8-8726eff281bb-kube-api-access-gkrkr\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.196852 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.231934 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbwfn"] Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.260455 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7220bc19-dc91-45bd-9ea8-8726eff281bb" (UID: "7220bc19-dc91-45bd-9ea8-8726eff281bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.297943 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7220bc19-dc91-45bd-9ea8-8726eff281bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.721979 4948 generic.go:334] "Generic (PLEG): container finished" podID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerID="868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198" exitCode=0 Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.722091 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-drddw" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.722078 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drddw" event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerDied","Data":"868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198"} Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.722227 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-drddw" event={"ID":"7220bc19-dc91-45bd-9ea8-8726eff281bb","Type":"ContainerDied","Data":"e07f01c23953895e6ea192b8339bf500f15dcc5f71810b54a193fa233bbadaad"} Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.722257 4948 scope.go:117] "RemoveContainer" containerID="868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.760378 4948 scope.go:117] "RemoveContainer" containerID="b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.761508 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-drddw"] Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.766249 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-drddw"] Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.789656 4948 scope.go:117] "RemoveContainer" containerID="e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.824788 4948 scope.go:117] "RemoveContainer" containerID="868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198" Dec 04 18:42:55 crc kubenswrapper[4948]: E1204 18:42:55.825528 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198\": container with ID starting with 868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198 not found: ID does not exist" containerID="868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.825581 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198"} err="failed to get container status \"868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198\": rpc error: code = NotFound desc = could not find container \"868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198\": container with ID starting with 868e2c7abe9c106198219a67af9843f18ce4816c1bd45a06ae5cf90995ec3198 not found: ID does not exist" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.825723 4948 scope.go:117] "RemoveContainer" containerID="b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823" Dec 04 18:42:55 crc kubenswrapper[4948]: E1204 18:42:55.826405 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823\": container with ID starting with b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823 not found: ID does not exist" containerID="b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.826428 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823"} err="failed to get container status \"b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823\": rpc error: code = NotFound desc = could not find container \"b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823\": container with ID 
starting with b2e7745fb0af1a594047482132a031300fc7ef40e37b0967ee92376250c2b823 not found: ID does not exist" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.826448 4948 scope.go:117] "RemoveContainer" containerID="e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c" Dec 04 18:42:55 crc kubenswrapper[4948]: E1204 18:42:55.826941 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c\": container with ID starting with e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c not found: ID does not exist" containerID="e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c" Dec 04 18:42:55 crc kubenswrapper[4948]: I1204 18:42:55.827006 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c"} err="failed to get container status \"e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c\": rpc error: code = NotFound desc = could not find container \"e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c\": container with ID starting with e24d52258a07e7ca941aef72b778cd6c848d09a108a8b037a11040dbd4a0777c not found: ID does not exist" Dec 04 18:42:56 crc kubenswrapper[4948]: I1204 18:42:56.735076 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tbwfn" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="registry-server" containerID="cri-o://4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33" gracePeriod=2 Dec 04 18:42:56 crc kubenswrapper[4948]: I1204 18:42:56.923969 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" path="/var/lib/kubelet/pods/7220bc19-dc91-45bd-9ea8-8726eff281bb/volumes" Dec 04 18:42:57 crc 
kubenswrapper[4948]: I1204 18:42:57.701342 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.740352 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-utilities\") pod \"2aef5269-857b-422e-9192-5b0b2852b1a8\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.740520 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-catalog-content\") pod \"2aef5269-857b-422e-9192-5b0b2852b1a8\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.740548 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwpdv\" (UniqueName: \"kubernetes.io/projected/2aef5269-857b-422e-9192-5b0b2852b1a8-kube-api-access-fwpdv\") pod \"2aef5269-857b-422e-9192-5b0b2852b1a8\" (UID: \"2aef5269-857b-422e-9192-5b0b2852b1a8\") " Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.741904 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-utilities" (OuterVolumeSpecName: "utilities") pod "2aef5269-857b-422e-9192-5b0b2852b1a8" (UID: "2aef5269-857b-422e-9192-5b0b2852b1a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.744513 4948 generic.go:334] "Generic (PLEG): container finished" podID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerID="4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33" exitCode=0 Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.744614 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbwfn" event={"ID":"2aef5269-857b-422e-9192-5b0b2852b1a8","Type":"ContainerDied","Data":"4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33"} Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.744690 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbwfn" event={"ID":"2aef5269-857b-422e-9192-5b0b2852b1a8","Type":"ContainerDied","Data":"9ee0175018d6b65524c3db913a54ff54b86498d38fbc8530c9fe3baa34cea3c5"} Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.744717 4948 scope.go:117] "RemoveContainer" containerID="4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.744591 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbwfn" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.754304 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aef5269-857b-422e-9192-5b0b2852b1a8-kube-api-access-fwpdv" (OuterVolumeSpecName: "kube-api-access-fwpdv") pod "2aef5269-857b-422e-9192-5b0b2852b1a8" (UID: "2aef5269-857b-422e-9192-5b0b2852b1a8"). InnerVolumeSpecName "kube-api-access-fwpdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.777436 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2aef5269-857b-422e-9192-5b0b2852b1a8" (UID: "2aef5269-857b-422e-9192-5b0b2852b1a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.784600 4948 scope.go:117] "RemoveContainer" containerID="47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.842861 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.842914 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwpdv\" (UniqueName: \"kubernetes.io/projected/2aef5269-857b-422e-9192-5b0b2852b1a8-kube-api-access-fwpdv\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.842937 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aef5269-857b-422e-9192-5b0b2852b1a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:42:57 crc kubenswrapper[4948]: I1204 18:42:57.914440 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:42:57 crc kubenswrapper[4948]: E1204 18:42:57.914834 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.106317 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbwfn"] Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.116840 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbwfn"] Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.309437 4948 scope.go:117] "RemoveContainer" containerID="bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.356808 4948 scope.go:117] "RemoveContainer" containerID="4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33" Dec 04 18:42:58 crc kubenswrapper[4948]: E1204 18:42:58.357529 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33\": container with ID starting with 4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33 not found: ID does not exist" containerID="4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.357578 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33"} err="failed to get container status \"4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33\": rpc error: code = NotFound desc = could not find container \"4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33\": container with ID starting with 4ef87ceeda5ab1f8e10915a7f0c4fb128eb0fcb03159e4a82b1030a8df437e33 not found: ID does not exist" Dec 04 18:42:58 
crc kubenswrapper[4948]: I1204 18:42:58.357609 4948 scope.go:117] "RemoveContainer" containerID="47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9" Dec 04 18:42:58 crc kubenswrapper[4948]: E1204 18:42:58.358309 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9\": container with ID starting with 47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9 not found: ID does not exist" containerID="47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.358339 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9"} err="failed to get container status \"47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9\": rpc error: code = NotFound desc = could not find container \"47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9\": container with ID starting with 47158ab18e2a185fa5a26a13538e41aa03d75945451b3db0ce46cfbbddf8b8f9 not found: ID does not exist" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.358360 4948 scope.go:117] "RemoveContainer" containerID="bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02" Dec 04 18:42:58 crc kubenswrapper[4948]: E1204 18:42:58.358984 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02\": container with ID starting with bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02 not found: ID does not exist" containerID="bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.359100 4948 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02"} err="failed to get container status \"bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02\": rpc error: code = NotFound desc = could not find container \"bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02\": container with ID starting with bf89b9bdc3e5ecb9bb3fd9245e543ba776b2b221f19386237074bc4fae163f02 not found: ID does not exist" Dec 04 18:42:58 crc kubenswrapper[4948]: I1204 18:42:58.933087 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" path="/var/lib/kubelet/pods/2aef5269-857b-422e-9192-5b0b2852b1a8/volumes" Dec 04 18:43:12 crc kubenswrapper[4948]: I1204 18:43:12.914608 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:43:12 crc kubenswrapper[4948]: E1204 18:43:12.915863 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:43:25 crc kubenswrapper[4948]: I1204 18:43:25.913929 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:43:25 crc kubenswrapper[4948]: E1204 18:43:25.914519 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:43:39 crc kubenswrapper[4948]: I1204 18:43:39.913823 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:43:39 crc kubenswrapper[4948]: E1204 18:43:39.914727 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:43:53 crc kubenswrapper[4948]: I1204 18:43:53.914858 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:43:54 crc kubenswrapper[4948]: I1204 18:43:54.298153 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"a29eaa42cdb0c16e7c8a62406f8650839b1b37e8186439675fc3f07033c143ae"} Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.163948 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx"] Dec 04 18:45:00 crc kubenswrapper[4948]: E1204 18:45:00.164888 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="extract-utilities" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.164907 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="extract-utilities" Dec 04 18:45:00 crc kubenswrapper[4948]: E1204 18:45:00.164925 4948 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="registry-server" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.164937 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="registry-server" Dec 04 18:45:00 crc kubenswrapper[4948]: E1204 18:45:00.164956 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="registry-server" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.164964 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="registry-server" Dec 04 18:45:00 crc kubenswrapper[4948]: E1204 18:45:00.164983 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="extract-content" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.164991 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="extract-content" Dec 04 18:45:00 crc kubenswrapper[4948]: E1204 18:45:00.165012 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="extract-content" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.165020 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="extract-content" Dec 04 18:45:00 crc kubenswrapper[4948]: E1204 18:45:00.165039 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="extract-utilities" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.165098 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="extract-utilities" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.165566 4948 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="2aef5269-857b-422e-9192-5b0b2852b1a8" containerName="registry-server" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.165635 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="7220bc19-dc91-45bd-9ea8-8726eff281bb" containerName="registry-server" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.166467 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.168660 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.169695 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.185939 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx"] Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.280762 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89zc\" (UniqueName: \"kubernetes.io/projected/2aa08cad-efe9-4ad8-89f1-f635b501d78e-kube-api-access-f89zc\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.280822 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa08cad-efe9-4ad8-89f1-f635b501d78e-config-volume\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.280907 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa08cad-efe9-4ad8-89f1-f635b501d78e-secret-volume\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.383241 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f89zc\" (UniqueName: \"kubernetes.io/projected/2aa08cad-efe9-4ad8-89f1-f635b501d78e-kube-api-access-f89zc\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.383314 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa08cad-efe9-4ad8-89f1-f635b501d78e-config-volume\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.383423 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa08cad-efe9-4ad8-89f1-f635b501d78e-secret-volume\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.385566 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2aa08cad-efe9-4ad8-89f1-f635b501d78e-config-volume\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.400004 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa08cad-efe9-4ad8-89f1-f635b501d78e-secret-volume\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.409573 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89zc\" (UniqueName: \"kubernetes.io/projected/2aa08cad-efe9-4ad8-89f1-f635b501d78e-kube-api-access-f89zc\") pod \"collect-profiles-29414565-mlkbx\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.527650 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:00 crc kubenswrapper[4948]: I1204 18:45:00.782035 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx"] Dec 04 18:45:01 crc kubenswrapper[4948]: I1204 18:45:01.246927 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" event={"ID":"2aa08cad-efe9-4ad8-89f1-f635b501d78e","Type":"ContainerStarted","Data":"4378494d45117be33564d046add68fa9fcc82ee6832d2eb47ba96941cc9ae43d"} Dec 04 18:45:02 crc kubenswrapper[4948]: I1204 18:45:02.263109 4948 generic.go:334] "Generic (PLEG): container finished" podID="2aa08cad-efe9-4ad8-89f1-f635b501d78e" containerID="ec99901cab6c196e437ba04bfa873b65af3cb50b48e6df1c3e61c0619e9a752f" exitCode=0 Dec 04 18:45:02 crc kubenswrapper[4948]: I1204 18:45:02.263191 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" event={"ID":"2aa08cad-efe9-4ad8-89f1-f635b501d78e","Type":"ContainerDied","Data":"ec99901cab6c196e437ba04bfa873b65af3cb50b48e6df1c3e61c0619e9a752f"} Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.638604 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.835789 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa08cad-efe9-4ad8-89f1-f635b501d78e-secret-volume\") pod \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.835892 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa08cad-efe9-4ad8-89f1-f635b501d78e-config-volume\") pod \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.835986 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f89zc\" (UniqueName: \"kubernetes.io/projected/2aa08cad-efe9-4ad8-89f1-f635b501d78e-kube-api-access-f89zc\") pod \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\" (UID: \"2aa08cad-efe9-4ad8-89f1-f635b501d78e\") " Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.837160 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa08cad-efe9-4ad8-89f1-f635b501d78e-config-volume" (OuterVolumeSpecName: "config-volume") pod "2aa08cad-efe9-4ad8-89f1-f635b501d78e" (UID: "2aa08cad-efe9-4ad8-89f1-f635b501d78e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.845765 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa08cad-efe9-4ad8-89f1-f635b501d78e-kube-api-access-f89zc" (OuterVolumeSpecName: "kube-api-access-f89zc") pod "2aa08cad-efe9-4ad8-89f1-f635b501d78e" (UID: "2aa08cad-efe9-4ad8-89f1-f635b501d78e"). 
InnerVolumeSpecName "kube-api-access-f89zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.846288 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa08cad-efe9-4ad8-89f1-f635b501d78e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2aa08cad-efe9-4ad8-89f1-f635b501d78e" (UID: "2aa08cad-efe9-4ad8-89f1-f635b501d78e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.937824 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aa08cad-efe9-4ad8-89f1-f635b501d78e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.937881 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f89zc\" (UniqueName: \"kubernetes.io/projected/2aa08cad-efe9-4ad8-89f1-f635b501d78e-kube-api-access-f89zc\") on node \"crc\" DevicePath \"\"" Dec 04 18:45:03 crc kubenswrapper[4948]: I1204 18:45:03.937905 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2aa08cad-efe9-4ad8-89f1-f635b501d78e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 18:45:04 crc kubenswrapper[4948]: I1204 18:45:04.284531 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" event={"ID":"2aa08cad-efe9-4ad8-89f1-f635b501d78e","Type":"ContainerDied","Data":"4378494d45117be33564d046add68fa9fcc82ee6832d2eb47ba96941cc9ae43d"} Dec 04 18:45:04 crc kubenswrapper[4948]: I1204 18:45:04.284591 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4378494d45117be33564d046add68fa9fcc82ee6832d2eb47ba96941cc9ae43d" Dec 04 18:45:04 crc kubenswrapper[4948]: I1204 18:45:04.284662 4948 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414565-mlkbx" Dec 04 18:45:04 crc kubenswrapper[4948]: I1204 18:45:04.716157 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4"] Dec 04 18:45:04 crc kubenswrapper[4948]: I1204 18:45:04.721186 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414520-tzhn4"] Dec 04 18:45:04 crc kubenswrapper[4948]: I1204 18:45:04.922742 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf95293-dc43-49fc-8f3f-259c3b5d2a11" path="/var/lib/kubelet/pods/8bf95293-dc43-49fc-8f3f-259c3b5d2a11/volumes" Dec 04 18:45:42 crc kubenswrapper[4948]: I1204 18:45:42.158264 4948 scope.go:117] "RemoveContainer" containerID="1849f5c57638f83dd1f1562122e4836ea898c2ee6e3703c5f4eb47622c594734" Dec 04 18:46:10 crc kubenswrapper[4948]: I1204 18:46:10.624934 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:46:10 crc kubenswrapper[4948]: I1204 18:46:10.625677 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:46:40 crc kubenswrapper[4948]: I1204 18:46:40.624855 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:46:40 crc kubenswrapper[4948]: I1204 18:46:40.625641 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.625537 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.626688 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.626770 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.627671 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a29eaa42cdb0c16e7c8a62406f8650839b1b37e8186439675fc3f07033c143ae"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.627774 4948 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://a29eaa42cdb0c16e7c8a62406f8650839b1b37e8186439675fc3f07033c143ae" gracePeriod=600 Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.875130 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="a29eaa42cdb0c16e7c8a62406f8650839b1b37e8186439675fc3f07033c143ae" exitCode=0 Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.875246 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"a29eaa42cdb0c16e7c8a62406f8650839b1b37e8186439675fc3f07033c143ae"} Dec 04 18:47:10 crc kubenswrapper[4948]: I1204 18:47:10.875599 4948 scope.go:117] "RemoveContainer" containerID="160244e12df784b63b68a4f8425d95d362c5db2e4347c6814dbf1f603e8179a6" Dec 04 18:47:11 crc kubenswrapper[4948]: I1204 18:47:11.887389 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250"} Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.189867 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vpfmd"] Dec 04 18:48:16 crc kubenswrapper[4948]: E1204 18:48:16.190841 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa08cad-efe9-4ad8-89f1-f635b501d78e" containerName="collect-profiles" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.190863 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa08cad-efe9-4ad8-89f1-f635b501d78e" containerName="collect-profiles" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.191237 4948 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa08cad-efe9-4ad8-89f1-f635b501d78e" containerName="collect-profiles" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.193174 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.197017 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpfmd"] Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.375473 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-utilities\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.375519 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tsmk\" (UniqueName: \"kubernetes.io/projected/ccf0107b-0c6a-444f-924d-f43e7a4882db-kube-api-access-7tsmk\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.375616 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-catalog-content\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.477418 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-catalog-content\") pod 
\"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.477551 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-utilities\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.477589 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tsmk\" (UniqueName: \"kubernetes.io/projected/ccf0107b-0c6a-444f-924d-f43e7a4882db-kube-api-access-7tsmk\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.477984 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-catalog-content\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.478023 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-utilities\") pod \"redhat-operators-vpfmd\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.499911 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tsmk\" (UniqueName: \"kubernetes.io/projected/ccf0107b-0c6a-444f-924d-f43e7a4882db-kube-api-access-7tsmk\") pod \"redhat-operators-vpfmd\" (UID: 
\"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.545112 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:16 crc kubenswrapper[4948]: I1204 18:48:16.835310 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpfmd"] Dec 04 18:48:17 crc kubenswrapper[4948]: I1204 18:48:17.549431 4948 generic.go:334] "Generic (PLEG): container finished" podID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerID="5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf" exitCode=0 Dec 04 18:48:17 crc kubenswrapper[4948]: I1204 18:48:17.549481 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerDied","Data":"5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf"} Dec 04 18:48:17 crc kubenswrapper[4948]: I1204 18:48:17.549715 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerStarted","Data":"247638a7426aaa90c29ddfd0a9069568ae52477ac4b176ffa6251e2ee4024cd3"} Dec 04 18:48:17 crc kubenswrapper[4948]: I1204 18:48:17.551760 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 18:48:18 crc kubenswrapper[4948]: I1204 18:48:18.560242 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerStarted","Data":"85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81"} Dec 04 18:48:19 crc kubenswrapper[4948]: I1204 18:48:19.566944 4948 generic.go:334] "Generic (PLEG): container finished" 
podID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerID="85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81" exitCode=0 Dec 04 18:48:19 crc kubenswrapper[4948]: I1204 18:48:19.567055 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerDied","Data":"85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81"} Dec 04 18:48:20 crc kubenswrapper[4948]: I1204 18:48:20.579693 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerStarted","Data":"62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc"} Dec 04 18:48:20 crc kubenswrapper[4948]: I1204 18:48:20.605180 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vpfmd" podStartSLOduration=2.12876597 podStartE2EDuration="4.605155118s" podCreationTimestamp="2025-12-04 18:48:16 +0000 UTC" firstStartedPulling="2025-12-04 18:48:17.55138914 +0000 UTC m=+4908.912463552" lastFinishedPulling="2025-12-04 18:48:20.027778258 +0000 UTC m=+4911.388852700" observedRunningTime="2025-12-04 18:48:20.601646168 +0000 UTC m=+4911.962720580" watchObservedRunningTime="2025-12-04 18:48:20.605155118 +0000 UTC m=+4911.966229540" Dec 04 18:48:26 crc kubenswrapper[4948]: I1204 18:48:26.546661 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:26 crc kubenswrapper[4948]: I1204 18:48:26.548905 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:27 crc kubenswrapper[4948]: I1204 18:48:27.592339 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vpfmd" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" 
containerName="registry-server" probeResult="failure" output=< Dec 04 18:48:27 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s Dec 04 18:48:27 crc kubenswrapper[4948]: > Dec 04 18:48:36 crc kubenswrapper[4948]: I1204 18:48:36.625976 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:36 crc kubenswrapper[4948]: I1204 18:48:36.711198 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:36 crc kubenswrapper[4948]: I1204 18:48:36.875113 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpfmd"] Dec 04 18:48:37 crc kubenswrapper[4948]: I1204 18:48:37.780019 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vpfmd" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="registry-server" containerID="cri-o://62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc" gracePeriod=2 Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.203849 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.312641 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-utilities\") pod \"ccf0107b-0c6a-444f-924d-f43e7a4882db\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.312831 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-catalog-content\") pod \"ccf0107b-0c6a-444f-924d-f43e7a4882db\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.312884 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tsmk\" (UniqueName: \"kubernetes.io/projected/ccf0107b-0c6a-444f-924d-f43e7a4882db-kube-api-access-7tsmk\") pod \"ccf0107b-0c6a-444f-924d-f43e7a4882db\" (UID: \"ccf0107b-0c6a-444f-924d-f43e7a4882db\") " Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.313853 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-utilities" (OuterVolumeSpecName: "utilities") pod "ccf0107b-0c6a-444f-924d-f43e7a4882db" (UID: "ccf0107b-0c6a-444f-924d-f43e7a4882db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.318292 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf0107b-0c6a-444f-924d-f43e7a4882db-kube-api-access-7tsmk" (OuterVolumeSpecName: "kube-api-access-7tsmk") pod "ccf0107b-0c6a-444f-924d-f43e7a4882db" (UID: "ccf0107b-0c6a-444f-924d-f43e7a4882db"). InnerVolumeSpecName "kube-api-access-7tsmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.416331 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.416360 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tsmk\" (UniqueName: \"kubernetes.io/projected/ccf0107b-0c6a-444f-924d-f43e7a4882db-kube-api-access-7tsmk\") on node \"crc\" DevicePath \"\"" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.448794 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccf0107b-0c6a-444f-924d-f43e7a4882db" (UID: "ccf0107b-0c6a-444f-924d-f43e7a4882db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.518451 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccf0107b-0c6a-444f-924d-f43e7a4882db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.791672 4948 generic.go:334] "Generic (PLEG): container finished" podID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerID="62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc" exitCode=0 Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.791722 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerDied","Data":"62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc"} Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.791769 4948 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vpfmd" event={"ID":"ccf0107b-0c6a-444f-924d-f43e7a4882db","Type":"ContainerDied","Data":"247638a7426aaa90c29ddfd0a9069568ae52477ac4b176ffa6251e2ee4024cd3"} Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.791766 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpfmd" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.791785 4948 scope.go:117] "RemoveContainer" containerID="62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.821450 4948 scope.go:117] "RemoveContainer" containerID="85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.832205 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpfmd"] Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.840450 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vpfmd"] Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.847025 4948 scope.go:117] "RemoveContainer" containerID="5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.889304 4948 scope.go:117] "RemoveContainer" containerID="62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc" Dec 04 18:48:38 crc kubenswrapper[4948]: E1204 18:48:38.890353 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc\": container with ID starting with 62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc not found: ID does not exist" containerID="62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.890421 4948 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc"} err="failed to get container status \"62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc\": rpc error: code = NotFound desc = could not find container \"62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc\": container with ID starting with 62cdc6365fa2e97061d70ab40875ea02a067fe87085c1f400044d513d2a157dc not found: ID does not exist" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.890453 4948 scope.go:117] "RemoveContainer" containerID="85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81" Dec 04 18:48:38 crc kubenswrapper[4948]: E1204 18:48:38.890819 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81\": container with ID starting with 85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81 not found: ID does not exist" containerID="85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.890850 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81"} err="failed to get container status \"85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81\": rpc error: code = NotFound desc = could not find container \"85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81\": container with ID starting with 85f2e287016f3df485a4d3e9d3862649eb1bcc89c73ec7edbae5e50cde656b81 not found: ID does not exist" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.890871 4948 scope.go:117] "RemoveContainer" containerID="5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf" Dec 04 18:48:38 crc kubenswrapper[4948]: E1204 
18:48:38.891450 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf\": container with ID starting with 5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf not found: ID does not exist" containerID="5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.891501 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf"} err="failed to get container status \"5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf\": rpc error: code = NotFound desc = could not find container \"5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf\": container with ID starting with 5c68f7156dbe28c503506e4a4c6f11b48a770a2ac5b06d5ecabe5416efad2faf not found: ID does not exist" Dec 04 18:48:38 crc kubenswrapper[4948]: I1204 18:48:38.927531 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" path="/var/lib/kubelet/pods/ccf0107b-0c6a-444f-924d-f43e7a4882db/volumes" Dec 04 18:49:40 crc kubenswrapper[4948]: I1204 18:49:40.624996 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:49:40 crc kubenswrapper[4948]: I1204 18:49:40.625690 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 18:50:10 crc kubenswrapper[4948]: I1204 18:50:10.625419 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:50:10 crc kubenswrapper[4948]: I1204 18:50:10.626325 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:50:40 crc kubenswrapper[4948]: I1204 18:50:40.625654 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:50:40 crc kubenswrapper[4948]: I1204 18:50:40.626872 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:50:40 crc kubenswrapper[4948]: I1204 18:50:40.627119 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" Dec 04 18:50:40 crc kubenswrapper[4948]: I1204 18:50:40.628749 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250"} 
pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 18:50:40 crc kubenswrapper[4948]: I1204 18:50:40.628849 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" gracePeriod=600 Dec 04 18:50:40 crc kubenswrapper[4948]: E1204 18:50:40.789464 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:50:41 crc kubenswrapper[4948]: I1204 18:50:41.003333 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" exitCode=0 Dec 04 18:50:41 crc kubenswrapper[4948]: I1204 18:50:41.003391 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250"} Dec 04 18:50:41 crc kubenswrapper[4948]: I1204 18:50:41.003528 4948 scope.go:117] "RemoveContainer" containerID="a29eaa42cdb0c16e7c8a62406f8650839b1b37e8186439675fc3f07033c143ae" Dec 04 18:50:41 crc kubenswrapper[4948]: I1204 18:50:41.004629 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 
04 18:50:41 crc kubenswrapper[4948]: E1204 18:50:41.005152 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:50:52 crc kubenswrapper[4948]: I1204 18:50:52.914397 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:50:52 crc kubenswrapper[4948]: E1204 18:50:52.915569 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:51:03 crc kubenswrapper[4948]: I1204 18:51:03.914394 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:51:03 crc kubenswrapper[4948]: E1204 18:51:03.915596 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:51:16 crc kubenswrapper[4948]: I1204 18:51:16.915088 4948 scope.go:117] "RemoveContainer" 
containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:51:16 crc kubenswrapper[4948]: E1204 18:51:16.916290 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:51:28 crc kubenswrapper[4948]: I1204 18:51:28.925296 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:51:28 crc kubenswrapper[4948]: E1204 18:51:28.926640 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:51:40 crc kubenswrapper[4948]: I1204 18:51:40.914199 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:51:40 crc kubenswrapper[4948]: E1204 18:51:40.915409 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:51:54 crc kubenswrapper[4948]: I1204 18:51:54.914935 4948 scope.go:117] 
"RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:51:54 crc kubenswrapper[4948]: E1204 18:51:54.915971 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:52:09 crc kubenswrapper[4948]: I1204 18:52:09.914811 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:52:09 crc kubenswrapper[4948]: E1204 18:52:09.915865 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:52:21 crc kubenswrapper[4948]: I1204 18:52:21.914159 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:52:21 crc kubenswrapper[4948]: E1204 18:52:21.915199 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:52:33 crc kubenswrapper[4948]: I1204 18:52:33.914139 
4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:52:33 crc kubenswrapper[4948]: E1204 18:52:33.914897 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:52:45 crc kubenswrapper[4948]: I1204 18:52:45.914192 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:52:45 crc kubenswrapper[4948]: E1204 18:52:45.915145 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:53:00 crc kubenswrapper[4948]: I1204 18:53:00.913751 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:53:00 crc kubenswrapper[4948]: E1204 18:53:00.915313 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 
18:53:14.122335 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tqkln"] Dec 04 18:53:14 crc kubenswrapper[4948]: E1204 18:53:14.123369 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="extract-utilities" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.123390 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="extract-utilities" Dec 04 18:53:14 crc kubenswrapper[4948]: E1204 18:53:14.123420 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="registry-server" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.123431 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="registry-server" Dec 04 18:53:14 crc kubenswrapper[4948]: E1204 18:53:14.123451 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="extract-content" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.123461 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="extract-content" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.123689 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf0107b-0c6a-444f-924d-f43e7a4882db" containerName="registry-server" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.125497 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.156710 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqkln"] Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.281020 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfj4l\" (UniqueName: \"kubernetes.io/projected/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-kube-api-access-cfj4l\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.281776 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-catalog-content\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.282518 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-utilities\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.383512 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-utilities\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.383627 4948 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cfj4l\" (UniqueName: \"kubernetes.io/projected/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-kube-api-access-cfj4l\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.383722 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-catalog-content\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.384251 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-utilities\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.384332 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-catalog-content\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.415488 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfj4l\" (UniqueName: \"kubernetes.io/projected/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-kube-api-access-cfj4l\") pod \"redhat-marketplace-tqkln\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.455141 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:14 crc kubenswrapper[4948]: I1204 18:53:14.927658 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqkln"] Dec 04 18:53:15 crc kubenswrapper[4948]: I1204 18:53:15.476882 4948 generic.go:334] "Generic (PLEG): container finished" podID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerID="2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f" exitCode=0 Dec 04 18:53:15 crc kubenswrapper[4948]: I1204 18:53:15.477130 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerDied","Data":"2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f"} Dec 04 18:53:15 crc kubenswrapper[4948]: I1204 18:53:15.477400 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerStarted","Data":"127f32969913c20a5913170957e915f285aeb99d8f73ac79133d16d83a6fd09c"} Dec 04 18:53:15 crc kubenswrapper[4948]: I1204 18:53:15.914147 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:53:15 crc kubenswrapper[4948]: E1204 18:53:15.914552 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:53:16 crc kubenswrapper[4948]: I1204 18:53:16.486687 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" 
event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerStarted","Data":"f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2"} Dec 04 18:53:17 crc kubenswrapper[4948]: I1204 18:53:17.500489 4948 generic.go:334] "Generic (PLEG): container finished" podID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerID="f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2" exitCode=0 Dec 04 18:53:17 crc kubenswrapper[4948]: I1204 18:53:17.500642 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerDied","Data":"f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2"} Dec 04 18:53:18 crc kubenswrapper[4948]: I1204 18:53:18.515157 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerStarted","Data":"bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c"} Dec 04 18:53:18 crc kubenswrapper[4948]: I1204 18:53:18.542872 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tqkln" podStartSLOduration=2.131186037 podStartE2EDuration="4.542852569s" podCreationTimestamp="2025-12-04 18:53:14 +0000 UTC" firstStartedPulling="2025-12-04 18:53:15.479736094 +0000 UTC m=+5206.840810536" lastFinishedPulling="2025-12-04 18:53:17.891402656 +0000 UTC m=+5209.252477068" observedRunningTime="2025-12-04 18:53:18.542114828 +0000 UTC m=+5209.903189260" watchObservedRunningTime="2025-12-04 18:53:18.542852569 +0000 UTC m=+5209.903926971" Dec 04 18:53:24 crc kubenswrapper[4948]: I1204 18:53:24.455948 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:24 crc kubenswrapper[4948]: I1204 18:53:24.456583 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:24 crc kubenswrapper[4948]: I1204 18:53:24.512650 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:24 crc kubenswrapper[4948]: I1204 18:53:24.642944 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:24 crc kubenswrapper[4948]: I1204 18:53:24.756892 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqkln"] Dec 04 18:53:26 crc kubenswrapper[4948]: I1204 18:53:26.610286 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tqkln" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="registry-server" containerID="cri-o://bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c" gracePeriod=2 Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.615893 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.621818 4948 generic.go:334] "Generic (PLEG): container finished" podID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerID="bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c" exitCode=0 Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.621863 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerDied","Data":"bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c"} Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.621897 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tqkln" event={"ID":"61f56380-43a4-4d1f-88a4-bc7ac6d09f25","Type":"ContainerDied","Data":"127f32969913c20a5913170957e915f285aeb99d8f73ac79133d16d83a6fd09c"} Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.621918 4948 scope.go:117] "RemoveContainer" containerID="bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.621974 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tqkln" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.659304 4948 scope.go:117] "RemoveContainer" containerID="f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.682278 4948 scope.go:117] "RemoveContainer" containerID="2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.710479 4948 scope.go:117] "RemoveContainer" containerID="bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c" Dec 04 18:53:27 crc kubenswrapper[4948]: E1204 18:53:27.711207 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c\": container with ID starting with bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c not found: ID does not exist" containerID="bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.711261 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c"} err="failed to get container status \"bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c\": rpc error: code = NotFound desc = could not find container \"bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c\": container with ID starting with bec307e47b27a8d9e32db821e3ea750b33e6810df6af5b2cf309c88d6bffb94c not found: ID does not exist" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.711297 4948 scope.go:117] "RemoveContainer" containerID="f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2" Dec 04 18:53:27 crc kubenswrapper[4948]: E1204 18:53:27.711816 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2\": container with ID starting with f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2 not found: ID does not exist" containerID="f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.711882 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2"} err="failed to get container status \"f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2\": rpc error: code = NotFound desc = could not find container \"f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2\": container with ID starting with f94e126b347015e0282324537bac5a3873feb7598ae46e4ade56374c5eff98a2 not found: ID does not exist" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.711919 4948 scope.go:117] "RemoveContainer" containerID="2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f" Dec 04 18:53:27 crc kubenswrapper[4948]: E1204 18:53:27.712363 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f\": container with ID starting with 2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f not found: ID does not exist" containerID="2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.712411 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f"} err="failed to get container status \"2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f\": rpc error: code = NotFound desc = could not find container 
\"2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f\": container with ID starting with 2e60d1b9874993b538a7476ba4250bb815da69ef77f7afdc55c79ffc15d1888f not found: ID does not exist" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.733668 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-utilities\") pod \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.733799 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-catalog-content\") pod \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.733841 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfj4l\" (UniqueName: \"kubernetes.io/projected/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-kube-api-access-cfj4l\") pod \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\" (UID: \"61f56380-43a4-4d1f-88a4-bc7ac6d09f25\") " Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.735705 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-utilities" (OuterVolumeSpecName: "utilities") pod "61f56380-43a4-4d1f-88a4-bc7ac6d09f25" (UID: "61f56380-43a4-4d1f-88a4-bc7ac6d09f25"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.739771 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-kube-api-access-cfj4l" (OuterVolumeSpecName: "kube-api-access-cfj4l") pod "61f56380-43a4-4d1f-88a4-bc7ac6d09f25" (UID: "61f56380-43a4-4d1f-88a4-bc7ac6d09f25"). InnerVolumeSpecName "kube-api-access-cfj4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.753543 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61f56380-43a4-4d1f-88a4-bc7ac6d09f25" (UID: "61f56380-43a4-4d1f-88a4-bc7ac6d09f25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.835675 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.835717 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.835734 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfj4l\" (UniqueName: \"kubernetes.io/projected/61f56380-43a4-4d1f-88a4-bc7ac6d09f25-kube-api-access-cfj4l\") on node \"crc\" DevicePath \"\"" Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 18:53:27.974168 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqkln"] Dec 04 18:53:27 crc kubenswrapper[4948]: I1204 
18:53:27.985679 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tqkln"] Dec 04 18:53:28 crc kubenswrapper[4948]: I1204 18:53:28.932883 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" path="/var/lib/kubelet/pods/61f56380-43a4-4d1f-88a4-bc7ac6d09f25/volumes" Dec 04 18:53:29 crc kubenswrapper[4948]: I1204 18:53:29.914483 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:53:29 crc kubenswrapper[4948]: E1204 18:53:29.915039 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.220655 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hrbmp"] Dec 04 18:53:35 crc kubenswrapper[4948]: E1204 18:53:35.221988 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="extract-utilities" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.222012 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="extract-utilities" Dec 04 18:53:35 crc kubenswrapper[4948]: E1204 18:53:35.222110 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="registry-server" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.222129 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" 
containerName="registry-server" Dec 04 18:53:35 crc kubenswrapper[4948]: E1204 18:53:35.222154 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="extract-content" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.222171 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="extract-content" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.222440 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f56380-43a4-4d1f-88a4-bc7ac6d09f25" containerName="registry-server" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.224452 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.228360 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrbmp"] Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.378573 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-utilities\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.378832 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dm84\" (UniqueName: \"kubernetes.io/projected/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-kube-api-access-2dm84\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.379031 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-catalog-content\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.480750 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-utilities\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.480785 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dm84\" (UniqueName: \"kubernetes.io/projected/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-kube-api-access-2dm84\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.480849 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-catalog-content\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.481301 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-catalog-content\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.481715 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-utilities\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.507353 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dm84\" (UniqueName: \"kubernetes.io/projected/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-kube-api-access-2dm84\") pod \"certified-operators-hrbmp\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:35 crc kubenswrapper[4948]: I1204 18:53:35.558007 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:36 crc kubenswrapper[4948]: I1204 18:53:36.061789 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrbmp"] Dec 04 18:53:36 crc kubenswrapper[4948]: I1204 18:53:36.720776 4948 generic.go:334] "Generic (PLEG): container finished" podID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerID="e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f" exitCode=0 Dec 04 18:53:36 crc kubenswrapper[4948]: I1204 18:53:36.720856 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrbmp" event={"ID":"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea","Type":"ContainerDied","Data":"e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f"} Dec 04 18:53:36 crc kubenswrapper[4948]: I1204 18:53:36.720905 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrbmp" event={"ID":"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea","Type":"ContainerStarted","Data":"b3ef531e05841731a829412e51af9253ce6c0f5d208650c6b8330ea0c20b68cb"} Dec 04 18:53:36 crc kubenswrapper[4948]: I1204 18:53:36.723020 4948 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 04 18:53:37 crc kubenswrapper[4948]: I1204 18:53:37.731810 4948 generic.go:334] "Generic (PLEG): container finished" podID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerID="57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42" exitCode=0 Dec 04 18:53:37 crc kubenswrapper[4948]: I1204 18:53:37.731910 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrbmp" event={"ID":"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea","Type":"ContainerDied","Data":"57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42"} Dec 04 18:53:38 crc kubenswrapper[4948]: I1204 18:53:38.746280 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrbmp" event={"ID":"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea","Type":"ContainerStarted","Data":"f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad"} Dec 04 18:53:38 crc kubenswrapper[4948]: I1204 18:53:38.780883 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hrbmp" podStartSLOduration=2.259647492 podStartE2EDuration="3.780851274s" podCreationTimestamp="2025-12-04 18:53:35 +0000 UTC" firstStartedPulling="2025-12-04 18:53:36.722516296 +0000 UTC m=+5228.083590728" lastFinishedPulling="2025-12-04 18:53:38.243720078 +0000 UTC m=+5229.604794510" observedRunningTime="2025-12-04 18:53:38.77688052 +0000 UTC m=+5230.137954962" watchObservedRunningTime="2025-12-04 18:53:38.780851274 +0000 UTC m=+5230.141925716" Dec 04 18:53:41 crc kubenswrapper[4948]: I1204 18:53:41.913604 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:53:41 crc kubenswrapper[4948]: E1204 18:53:41.914348 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:53:45 crc kubenswrapper[4948]: I1204 18:53:45.558677 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:45 crc kubenswrapper[4948]: I1204 18:53:45.559193 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:45 crc kubenswrapper[4948]: I1204 18:53:45.607072 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:45 crc kubenswrapper[4948]: I1204 18:53:45.868280 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:45 crc kubenswrapper[4948]: I1204 18:53:45.918341 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrbmp"] Dec 04 18:53:47 crc kubenswrapper[4948]: I1204 18:53:47.836695 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hrbmp" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="registry-server" containerID="cri-o://f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad" gracePeriod=2 Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.351166 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.517597 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dm84\" (UniqueName: \"kubernetes.io/projected/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-kube-api-access-2dm84\") pod \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.517725 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-catalog-content\") pod \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.517841 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-utilities\") pod \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\" (UID: \"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea\") " Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.519536 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-utilities" (OuterVolumeSpecName: "utilities") pod "779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" (UID: "779c0cbf-c7cc-448b-ad1e-5955ddcd05ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.526255 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-kube-api-access-2dm84" (OuterVolumeSpecName: "kube-api-access-2dm84") pod "779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" (UID: "779c0cbf-c7cc-448b-ad1e-5955ddcd05ea"). InnerVolumeSpecName "kube-api-access-2dm84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.576251 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" (UID: "779c0cbf-c7cc-448b-ad1e-5955ddcd05ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.621351 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dm84\" (UniqueName: \"kubernetes.io/projected/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-kube-api-access-2dm84\") on node \"crc\" DevicePath \"\"" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.621402 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.621419 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.862329 4948 generic.go:334] "Generic (PLEG): container finished" podID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerID="f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad" exitCode=0 Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.862404 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrbmp" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.862390 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrbmp" event={"ID":"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea","Type":"ContainerDied","Data":"f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad"} Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.862500 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrbmp" event={"ID":"779c0cbf-c7cc-448b-ad1e-5955ddcd05ea","Type":"ContainerDied","Data":"b3ef531e05841731a829412e51af9253ce6c0f5d208650c6b8330ea0c20b68cb"} Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.862542 4948 scope.go:117] "RemoveContainer" containerID="f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.902701 4948 scope.go:117] "RemoveContainer" containerID="57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.932354 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrbmp"] Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.938946 4948 scope.go:117] "RemoveContainer" containerID="e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.946713 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hrbmp"] Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.970516 4948 scope.go:117] "RemoveContainer" containerID="f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad" Dec 04 18:53:49 crc kubenswrapper[4948]: E1204 18:53:49.970998 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad\": container with ID starting with f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad not found: ID does not exist" containerID="f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.971035 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad"} err="failed to get container status \"f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad\": rpc error: code = NotFound desc = could not find container \"f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad\": container with ID starting with f3c7a69735109d6dda209dab7c272d9c684c9110e89b11d1569a5c72e89622ad not found: ID does not exist" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.971080 4948 scope.go:117] "RemoveContainer" containerID="57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42" Dec 04 18:53:49 crc kubenswrapper[4948]: E1204 18:53:49.971615 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42\": container with ID starting with 57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42 not found: ID does not exist" containerID="57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.971648 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42"} err="failed to get container status \"57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42\": rpc error: code = NotFound desc = could not find container \"57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42\": container with ID 
starting with 57e200b5c527a0d269be15b282a925fd9199cd05f39334c3ca6f85005b948d42 not found: ID does not exist" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.971665 4948 scope.go:117] "RemoveContainer" containerID="e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f" Dec 04 18:53:49 crc kubenswrapper[4948]: E1204 18:53:49.971941 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f\": container with ID starting with e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f not found: ID does not exist" containerID="e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f" Dec 04 18:53:49 crc kubenswrapper[4948]: I1204 18:53:49.971964 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f"} err="failed to get container status \"e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f\": rpc error: code = NotFound desc = could not find container \"e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f\": container with ID starting with e1c2053910633dd2378eeda0e98f13b8f4fd9c464161ccfdc2935eac0b563c0f not found: ID does not exist" Dec 04 18:53:50 crc kubenswrapper[4948]: I1204 18:53:50.930530 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" path="/var/lib/kubelet/pods/779c0cbf-c7cc-448b-ad1e-5955ddcd05ea/volumes" Dec 04 18:53:53 crc kubenswrapper[4948]: I1204 18:53:53.914172 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:53:53 crc kubenswrapper[4948]: E1204 18:53:53.914605 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:54:07 crc kubenswrapper[4948]: I1204 18:54:07.914022 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:54:07 crc kubenswrapper[4948]: E1204 18:54:07.915525 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.475150 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kzqmq/must-gather-dmrx2"] Dec 04 18:54:13 crc kubenswrapper[4948]: E1204 18:54:13.475987 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="extract-utilities" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.476001 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="extract-utilities" Dec 04 18:54:13 crc kubenswrapper[4948]: E1204 18:54:13.476016 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="extract-content" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.476022 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="extract-content" Dec 04 18:54:13 crc kubenswrapper[4948]: E1204 18:54:13.476033 4948 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="registry-server" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.476056 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="registry-server" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.476193 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="779c0cbf-c7cc-448b-ad1e-5955ddcd05ea" containerName="registry-server" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.476976 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.479776 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kzqmq"/"default-dockercfg-kjxw5" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.479980 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kzqmq"/"kube-root-ca.crt" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.480368 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kzqmq"/"openshift-service-ca.crt" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.485726 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kzqmq/must-gather-dmrx2"] Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.627455 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v6mr\" (UniqueName: \"kubernetes.io/projected/e5c589da-86af-47b6-814b-19e03d691909-kube-api-access-2v6mr\") pod \"must-gather-dmrx2\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.627647 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5c589da-86af-47b6-814b-19e03d691909-must-gather-output\") pod \"must-gather-dmrx2\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.728709 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v6mr\" (UniqueName: \"kubernetes.io/projected/e5c589da-86af-47b6-814b-19e03d691909-kube-api-access-2v6mr\") pod \"must-gather-dmrx2\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.728820 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5c589da-86af-47b6-814b-19e03d691909-must-gather-output\") pod \"must-gather-dmrx2\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.729253 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5c589da-86af-47b6-814b-19e03d691909-must-gather-output\") pod \"must-gather-dmrx2\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.750687 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v6mr\" (UniqueName: \"kubernetes.io/projected/e5c589da-86af-47b6-814b-19e03d691909-kube-api-access-2v6mr\") pod \"must-gather-dmrx2\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:13 crc kubenswrapper[4948]: I1204 18:54:13.793005 4948 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:54:14 crc kubenswrapper[4948]: I1204 18:54:14.034749 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kzqmq/must-gather-dmrx2"] Dec 04 18:54:14 crc kubenswrapper[4948]: I1204 18:54:14.106989 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" event={"ID":"e5c589da-86af-47b6-814b-19e03d691909","Type":"ContainerStarted","Data":"e46c093acc73e2db12d9a0cc672feb3feee2b674489553a56f66f2c5a354cb7a"} Dec 04 18:54:19 crc kubenswrapper[4948]: I1204 18:54:19.150356 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" event={"ID":"e5c589da-86af-47b6-814b-19e03d691909","Type":"ContainerStarted","Data":"76c8ac84c6df3eb3194ca9311304b20aad62a449d822074a134ed0731368d01e"} Dec 04 18:54:19 crc kubenswrapper[4948]: I1204 18:54:19.150769 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" event={"ID":"e5c589da-86af-47b6-814b-19e03d691909","Type":"ContainerStarted","Data":"1d16cd402f6bf5619034ac4e3f23ea3173787dccbf587c733c9dcf2ace75143c"} Dec 04 18:54:19 crc kubenswrapper[4948]: I1204 18:54:19.175991 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" podStartSLOduration=2.319519342 podStartE2EDuration="6.175968793s" podCreationTimestamp="2025-12-04 18:54:13 +0000 UTC" firstStartedPulling="2025-12-04 18:54:14.04965284 +0000 UTC m=+5265.410727242" lastFinishedPulling="2025-12-04 18:54:17.906102261 +0000 UTC m=+5269.267176693" observedRunningTime="2025-12-04 18:54:19.168001024 +0000 UTC m=+5270.529075436" watchObservedRunningTime="2025-12-04 18:54:19.175968793 +0000 UTC m=+5270.537043205" Dec 04 18:54:21 crc kubenswrapper[4948]: I1204 18:54:21.914256 4948 scope.go:117] "RemoveContainer" 
containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:54:21 crc kubenswrapper[4948]: E1204 18:54:21.914861 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:54:33 crc kubenswrapper[4948]: I1204 18:54:33.914733 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:54:33 crc kubenswrapper[4948]: E1204 18:54:33.915508 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:54:47 crc kubenswrapper[4948]: I1204 18:54:47.914096 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:54:47 crc kubenswrapper[4948]: E1204 18:54:47.915141 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:54:59 crc kubenswrapper[4948]: I1204 18:54:59.913814 4948 scope.go:117] 
"RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:54:59 crc kubenswrapper[4948]: E1204 18:54:59.914559 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:55:13 crc kubenswrapper[4948]: I1204 18:55:13.913760 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:55:13 crc kubenswrapper[4948]: E1204 18:55:13.914775 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:55:16 crc kubenswrapper[4948]: I1204 18:55:16.881609 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/util/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.028242 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/util/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.068587 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/pull/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.121606 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/pull/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.242110 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/pull/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.249410 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/extract/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.285187 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_69e8e537e04f4de464a062ccb6541f4e07f967b09dc2ef87a0d14bac4ahlhh9_2195ca22-748e-4ba3-9524-891a74e7440e/util/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.431941 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-xnn4n_4fde973d-7944-478b-a53d-6cbfdbce85e6/kube-rbac-proxy/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.494830 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-xnn4n_4fde973d-7944-478b-a53d-6cbfdbce85e6/manager/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.563873 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-4lgxs_332ef640-0de7-423e-a7d0-39637d3b4ada/kube-rbac-proxy/0.log" Dec 04 18:55:17 crc 
kubenswrapper[4948]: I1204 18:55:17.682433 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-4lgxs_332ef640-0de7-423e-a7d0-39637d3b4ada/manager/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.688113 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-gb89j_3c59027a-d806-4798-8338-a2ea5c9ba1ba/kube-rbac-proxy/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.782864 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-gb89j_3c59027a-d806-4798-8338-a2ea5c9ba1ba/manager/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.851181 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hvw8z_4bb211de-d340-4f3d-999f-d0759663fc73/kube-rbac-proxy/0.log" Dec 04 18:55:17 crc kubenswrapper[4948]: I1204 18:55:17.980330 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-hvw8z_4bb211de-d340-4f3d-999f-d0759663fc73/manager/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.022239 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-ff5cf_cb34830b-da24-4d66-b3ca-136506c4ef7b/kube-rbac-proxy/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.085698 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-ff5cf_cb34830b-da24-4d66-b3ca-136506c4ef7b/manager/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.180914 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-2wn6p_8000563f-fa21-4755-8434-fc5c4e25cd99/kube-rbac-proxy/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.306717 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-2wn6p_8000563f-fa21-4755-8434-fc5c4e25cd99/manager/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.411678 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-k8zrd_03cc6bc7-10ac-4521-9688-bbff0633f05a/kube-rbac-proxy/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.509491 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-k8zrd_03cc6bc7-10ac-4521-9688-bbff0633f05a/manager/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.521778 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxvwh_0740d32c-babe-4471-9f15-211080e05cbb/kube-rbac-proxy/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.831074 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-pxvwh_0740d32c-babe-4471-9f15-211080e05cbb/manager/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.900674 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-p7mxr_98b7f9ee-20c5-4821-9841-c44b60650d4e/kube-rbac-proxy/0.log" Dec 04 18:55:18 crc kubenswrapper[4948]: I1204 18:55:18.975763 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-p7mxr_98b7f9ee-20c5-4821-9841-c44b60650d4e/manager/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.102607 
4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-pvwlh_d67b2a10-118c-4a3a-8cc8-a5dc33a92896/kube-rbac-proxy/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.144517 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-pvwlh_d67b2a10-118c-4a3a-8cc8-a5dc33a92896/manager/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.234249 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-tx6s4_c9f08351-bf2d-4272-a43e-c8770c413a7c/kube-rbac-proxy/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.343827 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-tx6s4_c9f08351-bf2d-4272-a43e-c8770c413a7c/manager/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.385434 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-89ndh_d228e40b-4c01-4794-a80e-7b77ec37ba2b/kube-rbac-proxy/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.464705 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-89ndh_d228e40b-4c01-4794-a80e-7b77ec37ba2b/manager/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.551152 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p5t6c_f621ca2b-bd4b-41a2-b11a-985f094886b1/kube-rbac-proxy/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.615520 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p5t6c_f621ca2b-bd4b-41a2-b11a-985f094886b1/manager/0.log" Dec 04 18:55:19 crc 
kubenswrapper[4948]: I1204 18:55:19.713070 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-5g87l_bf2472b9-2441-4c7b-9d50-928f8dc38c78/kube-rbac-proxy/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.725018 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-5g87l_bf2472b9-2441-4c7b-9d50-928f8dc38c78/manager/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.853796 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp_0ef37c1f-0fdf-43bd-81cf-4a359b671653/kube-rbac-proxy/0.log" Dec 04 18:55:19 crc kubenswrapper[4948]: I1204 18:55:19.902679 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4lxmfp_0ef37c1f-0fdf-43bd-81cf-4a359b671653/manager/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.218589 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-27h4f_6b7d9d62-89d9-47b1-8757-7c6da0fe06db/registry-server/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.301198 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-66bcc8f984-mrlv5_3ec94845-de1c-4ed1-b588-7cbe115fb1d7/operator/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.338191 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-jlxw2_a118aaa1-bd32-4cbb-bc4b-6561faeca58b/kube-rbac-proxy/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.551205 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-2hntl_cb78bf36-4988-4814-b6a5-cf5c869eaee6/kube-rbac-proxy/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.717377 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-f65bcfbd6-ksq9b_e1c25561-350e-4093-8f84-17a631b22d36/manager/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.831989 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-2hntl_cb78bf36-4988-4814-b6a5-cf5c869eaee6/manager/0.log" Dec 04 18:55:20 crc kubenswrapper[4948]: I1204 18:55:20.877843 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-jlxw2_a118aaa1-bd32-4cbb-bc4b-6561faeca58b/manager/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.016657 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wgqdn_80927c44-1bff-48a7-8f3a-25ca44033176/operator/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.044162 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-zqk8k_e1097cb0-78f6-49a6-87d2-4aa88fb31f58/kube-rbac-proxy/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.101686 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-zqk8k_e1097cb0-78f6-49a6-87d2-4aa88fb31f58/manager/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.246969 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-9dfcj_07646f24-8e16-4202-b2f9-ac13a751235e/kube-rbac-proxy/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.292925 4948 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-9dfcj_07646f24-8e16-4202-b2f9-ac13a751235e/manager/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.336489 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fvll5_f5180167-c92a-4f8a-b924-ee9d6e080261/kube-rbac-proxy/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.413134 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fvll5_f5180167-c92a-4f8a-b924-ee9d6e080261/manager/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.493439 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-wrj57_bd023a53-5fe4-4660-aee2-c8565808da2f/kube-rbac-proxy/0.log" Dec 04 18:55:21 crc kubenswrapper[4948]: I1204 18:55:21.510286 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-wrj57_bd023a53-5fe4-4660-aee2-c8565808da2f/manager/0.log" Dec 04 18:55:24 crc kubenswrapper[4948]: I1204 18:55:24.914487 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:55:24 crc kubenswrapper[4948]: E1204 18:55:24.914906 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:55:39 crc kubenswrapper[4948]: I1204 18:55:39.913962 4948 scope.go:117] "RemoveContainer" 
containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:55:39 crc kubenswrapper[4948]: E1204 18:55:39.914621 4948 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hfvn4_openshift-machine-config-operator(9c5bb3e4-2f5a-47d7-a998-be50d1429cb2)\"" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" Dec 04 18:55:40 crc kubenswrapper[4948]: I1204 18:55:40.068061 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hg692_4005cfa7-7eda-43d9-ba7f-fe06d42c82d2/control-plane-machine-set-operator/0.log" Dec 04 18:55:40 crc kubenswrapper[4948]: I1204 18:55:40.257393 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xqmtt_1fb6542e-ebb3-4df7-95d3-7c6c55fcd845/kube-rbac-proxy/0.log" Dec 04 18:55:40 crc kubenswrapper[4948]: I1204 18:55:40.276833 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xqmtt_1fb6542e-ebb3-4df7-95d3-7c6c55fcd845/machine-api-operator/0.log" Dec 04 18:55:50 crc kubenswrapper[4948]: I1204 18:55:50.913238 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250" Dec 04 18:55:51 crc kubenswrapper[4948]: I1204 18:55:51.904497 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"23d99bd304150fa77813e5ebbfd3e6f62f8172b107cd68932012eabeaf18a1f2"} Dec 04 18:55:53 crc kubenswrapper[4948]: I1204 18:55:53.404606 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-n2t4w_e507144d-9fd7-420e-881a-99aee0044dd4/cert-manager-controller/0.log" Dec 04 18:55:53 crc kubenswrapper[4948]: I1204 18:55:53.563205 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-xwz2z_1085add4-ec56-476e-8816-81284e3676e4/cert-manager-cainjector/0.log" Dec 04 18:55:53 crc kubenswrapper[4948]: I1204 18:55:53.622629 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-rl477_98a09c31-f983-44e5-8454-39df52726e91/cert-manager-webhook/0.log" Dec 04 18:56:07 crc kubenswrapper[4948]: I1204 18:56:07.226639 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-bwmvw_08e03b38-421e-4e93-a695-6e090536ecae/nmstate-console-plugin/0.log" Dec 04 18:56:07 crc kubenswrapper[4948]: I1204 18:56:07.382092 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8tjwj_b9b89e09-1bdf-47ea-a59a-fa532aae1589/nmstate-handler/0.log" Dec 04 18:56:07 crc kubenswrapper[4948]: I1204 18:56:07.384549 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-sjx9c_7781d778-2308-4cbd-aaa0-73588dc5f945/kube-rbac-proxy/0.log" Dec 04 18:56:07 crc kubenswrapper[4948]: I1204 18:56:07.413755 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-sjx9c_7781d778-2308-4cbd-aaa0-73588dc5f945/nmstate-metrics/0.log" Dec 04 18:56:07 crc kubenswrapper[4948]: I1204 18:56:07.612007 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-r99vf_16777efc-db01-42ba-8a24-e966989fb402/nmstate-operator/0.log" Dec 04 18:56:07 crc kubenswrapper[4948]: I1204 18:56:07.665208 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-qxx5m_c48807df-003d-4f1b-8819-94dc6017e382/nmstate-webhook/0.log" Dec 04 18:56:22 crc kubenswrapper[4948]: I1204 18:56:22.979966 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-4mcsv_80cdeb47-3878-4100-bf8b-bed7e8df3c74/kube-rbac-proxy/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.207764 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-frr-files/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.386073 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-frr-files/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.387611 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-reloader/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.396221 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-4mcsv_80cdeb47-3878-4100-bf8b-bed7e8df3c74/controller/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.416547 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-metrics/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.533730 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-reloader/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.714552 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-metrics/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.716504 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-reloader/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.730196 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-frr-files/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.766536 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-metrics/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.924114 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-frr-files/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.954009 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-reloader/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.957438 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/cp-metrics/0.log" Dec 04 18:56:23 crc kubenswrapper[4948]: I1204 18:56:23.969238 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/controller/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.142671 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/frr-metrics/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.201245 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/kube-rbac-proxy-frr/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.216765 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/kube-rbac-proxy/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.366771 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/reloader/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.415884 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-tmtnb_0ca06785-e8b8-43ba-919d-2a00e88b9092/frr-k8s-webhook-server/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.649495 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-769d6bffcb-ptk7q_7372c6f5-6966-4cdf-a798-514c25eb08c3/manager/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.798404 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-54cdb97cf7-jfhjs_9ec6eeb9-ca17-49dc-bec3-c956b8c63c60/webhook-server/0.log" Dec 04 18:56:24 crc kubenswrapper[4948]: I1204 18:56:24.870802 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9jvll_6412088a-587d-4cf6-b85a-087535dc9378/kube-rbac-proxy/0.log" Dec 04 18:56:25 crc kubenswrapper[4948]: I1204 18:56:25.384471 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7v8rn_0b72e58a-51a9-49bc-b31e-ce04b0daf651/frr/0.log" Dec 04 18:56:25 crc kubenswrapper[4948]: I1204 18:56:25.409005 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-9jvll_6412088a-587d-4cf6-b85a-087535dc9378/speaker/0.log" Dec 04 18:56:38 crc kubenswrapper[4948]: I1204 18:56:38.731410 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/util/0.log" Dec 04 18:56:38 crc kubenswrapper[4948]: I1204 
18:56:38.908712 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/util/0.log" Dec 04 18:56:38 crc kubenswrapper[4948]: I1204 18:56:38.967295 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/pull/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.017060 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/pull/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.203256 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/extract/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.207293 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/util/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.242195 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931anqggq_bca39b4b-a6e8-4ac1-b2c6-a4217e3c30cd/pull/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.363829 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/util/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.546561 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/pull/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.566107 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/pull/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.580685 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/util/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.739015 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/util/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.759236 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/pull/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.796523 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fg8gsq_b34a0876-8eb9-4701-8597-f9020c25f2d8/extract/0.log" Dec 04 18:56:39 crc kubenswrapper[4948]: I1204 18:56:39.892261 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/util/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.092248 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/pull/0.log" Dec 04 
18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.096912 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/util/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.109258 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/pull/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.299403 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/extract/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.311912 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/util/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.321949 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p6c28_8cba0165-dc0e-450d-b958-eb2d861e3b15/pull/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.492416 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/extract-utilities/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.819651 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/extract-content/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.829244 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/extract-content/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.868901 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/extract-utilities/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.922901 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/extract-utilities/0.log" Dec 04 18:56:40 crc kubenswrapper[4948]: I1204 18:56:40.992709 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/extract-content/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.174556 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/extract-utilities/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.225980 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8wmds_98f8ed4c-aff2-4925-bd20-9c87ae114c9c/registry-server/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.299231 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/extract-content/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.315184 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/extract-utilities/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.361675 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/extract-content/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.520825 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/extract-content/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.521069 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/extract-utilities/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.747685 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lqjx9_8a118304-78c9-447c-84af-57843d1f901d/marketplace-operator/0.log" Dec 04 18:56:41 crc kubenswrapper[4948]: I1204 18:56:41.787027 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/extract-utilities/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.017114 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/extract-utilities/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.064554 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/extract-content/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.104755 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/extract-content/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.276535 4948 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/extract-content/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.281949 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k2vf9_0293d2b5-dfcf-4589-8464-8f4f1616bd5d/registry-server/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.291095 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/extract-utilities/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.480604 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/extract-utilities/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.498648 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-ps6x2_73953573-1a1a-49b7-a686-2056cf6e6937/registry-server/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.608365 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/extract-content/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.613495 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/extract-utilities/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.620296 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/extract-content/0.log" Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.760624 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/extract-utilities/0.log" 
Dec 04 18:56:42 crc kubenswrapper[4948]: I1204 18:56:42.815001 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/extract-content/0.log" Dec 04 18:56:43 crc kubenswrapper[4948]: I1204 18:56:43.435718 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9frz6_b6a62a2c-ea79-4cc5-ac96-0b485cda907c/registry-server/0.log" Dec 04 18:57:47 crc kubenswrapper[4948]: I1204 18:57:47.850399 4948 generic.go:334] "Generic (PLEG): container finished" podID="e5c589da-86af-47b6-814b-19e03d691909" containerID="1d16cd402f6bf5619034ac4e3f23ea3173787dccbf587c733c9dcf2ace75143c" exitCode=0 Dec 04 18:57:47 crc kubenswrapper[4948]: I1204 18:57:47.850622 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" event={"ID":"e5c589da-86af-47b6-814b-19e03d691909","Type":"ContainerDied","Data":"1d16cd402f6bf5619034ac4e3f23ea3173787dccbf587c733c9dcf2ace75143c"} Dec 04 18:57:47 crc kubenswrapper[4948]: I1204 18:57:47.852096 4948 scope.go:117] "RemoveContainer" containerID="1d16cd402f6bf5619034ac4e3f23ea3173787dccbf587c733c9dcf2ace75143c" Dec 04 18:57:48 crc kubenswrapper[4948]: I1204 18:57:48.232523 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzqmq_must-gather-dmrx2_e5c589da-86af-47b6-814b-19e03d691909/gather/0.log" Dec 04 18:57:55 crc kubenswrapper[4948]: I1204 18:57:55.748826 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kzqmq/must-gather-dmrx2"] Dec 04 18:57:55 crc kubenswrapper[4948]: I1204 18:57:55.750303 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="copy" containerID="cri-o://76c8ac84c6df3eb3194ca9311304b20aad62a449d822074a134ed0731368d01e" gracePeriod=2 Dec 04 18:57:55 crc 
kubenswrapper[4948]: I1204 18:57:55.756697 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kzqmq/must-gather-dmrx2"] Dec 04 18:57:55 crc kubenswrapper[4948]: I1204 18:57:55.917415 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzqmq_must-gather-dmrx2_e5c589da-86af-47b6-814b-19e03d691909/copy/0.log" Dec 04 18:57:55 crc kubenswrapper[4948]: I1204 18:57:55.918196 4948 generic.go:334] "Generic (PLEG): container finished" podID="e5c589da-86af-47b6-814b-19e03d691909" containerID="76c8ac84c6df3eb3194ca9311304b20aad62a449d822074a134ed0731368d01e" exitCode=143 Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.174786 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzqmq_must-gather-dmrx2_e5c589da-86af-47b6-814b-19e03d691909/copy/0.log" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.175359 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.313848 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5c589da-86af-47b6-814b-19e03d691909-must-gather-output\") pod \"e5c589da-86af-47b6-814b-19e03d691909\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.313949 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v6mr\" (UniqueName: \"kubernetes.io/projected/e5c589da-86af-47b6-814b-19e03d691909-kube-api-access-2v6mr\") pod \"e5c589da-86af-47b6-814b-19e03d691909\" (UID: \"e5c589da-86af-47b6-814b-19e03d691909\") " Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.320538 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e5c589da-86af-47b6-814b-19e03d691909-kube-api-access-2v6mr" (OuterVolumeSpecName: "kube-api-access-2v6mr") pod "e5c589da-86af-47b6-814b-19e03d691909" (UID: "e5c589da-86af-47b6-814b-19e03d691909"). InnerVolumeSpecName "kube-api-access-2v6mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.405671 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5c589da-86af-47b6-814b-19e03d691909-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e5c589da-86af-47b6-814b-19e03d691909" (UID: "e5c589da-86af-47b6-814b-19e03d691909"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.415991 4948 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e5c589da-86af-47b6-814b-19e03d691909-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.416026 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v6mr\" (UniqueName: \"kubernetes.io/projected/e5c589da-86af-47b6-814b-19e03d691909-kube-api-access-2v6mr\") on node \"crc\" DevicePath \"\"" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.933985 4948 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzqmq_must-gather-dmrx2_e5c589da-86af-47b6-814b-19e03d691909/copy/0.log" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.934998 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzqmq/must-gather-dmrx2" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.937007 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c589da-86af-47b6-814b-19e03d691909" path="/var/lib/kubelet/pods/e5c589da-86af-47b6-814b-19e03d691909/volumes" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.938373 4948 scope.go:117] "RemoveContainer" containerID="76c8ac84c6df3eb3194ca9311304b20aad62a449d822074a134ed0731368d01e" Dec 04 18:57:56 crc kubenswrapper[4948]: I1204 18:57:56.979947 4948 scope.go:117] "RemoveContainer" containerID="1d16cd402f6bf5619034ac4e3f23ea3173787dccbf587c733c9dcf2ace75143c" Dec 04 18:58:10 crc kubenswrapper[4948]: I1204 18:58:10.624790 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 18:58:10 crc kubenswrapper[4948]: I1204 18:58:10.626343 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.124967 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d92kf"] Dec 04 18:58:28 crc kubenswrapper[4948]: E1204 18:58:28.126186 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="gather" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.126211 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="gather" Dec 04 18:58:28 crc 
kubenswrapper[4948]: E1204 18:58:28.126257 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="copy" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.126270 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="copy" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.126550 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="copy" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.126594 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c589da-86af-47b6-814b-19e03d691909" containerName="gather" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.128455 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.149752 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d92kf"] Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.248403 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-catalog-content\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.248477 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-utilities\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.248683 4948 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrwx\" (UniqueName: \"kubernetes.io/projected/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-kube-api-access-hwrwx\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.350039 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-utilities\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.350240 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrwx\" (UniqueName: \"kubernetes.io/projected/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-kube-api-access-hwrwx\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.350296 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-catalog-content\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.350728 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-utilities\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.350897 4948 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-catalog-content\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.370941 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrwx\" (UniqueName: \"kubernetes.io/projected/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-kube-api-access-hwrwx\") pod \"redhat-operators-d92kf\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") " pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.452157 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d92kf" Dec 04 18:58:28 crc kubenswrapper[4948]: I1204 18:58:28.698228 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d92kf"] Dec 04 18:58:29 crc kubenswrapper[4948]: I1204 18:58:29.255876 4948 generic.go:334] "Generic (PLEG): container finished" podID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerID="56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097" exitCode=0 Dec 04 18:58:29 crc kubenswrapper[4948]: I1204 18:58:29.256257 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerDied","Data":"56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097"} Dec 04 18:58:29 crc kubenswrapper[4948]: I1204 18:58:29.256297 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerStarted","Data":"ef6ce3ce292b48b8f79f1224508de368f5c9fcf30537aadb4976cb3736a7e63b"} Dec 04 18:58:30 crc kubenswrapper[4948]: I1204 18:58:30.267021 
4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerStarted","Data":"4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77"}
Dec 04 18:58:31 crc kubenswrapper[4948]: I1204 18:58:31.278298 4948 generic.go:334] "Generic (PLEG): container finished" podID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerID="4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77" exitCode=0
Dec 04 18:58:31 crc kubenswrapper[4948]: I1204 18:58:31.278417 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerDied","Data":"4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77"}
Dec 04 18:58:32 crc kubenswrapper[4948]: I1204 18:58:32.292638 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerStarted","Data":"2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241"}
Dec 04 18:58:32 crc kubenswrapper[4948]: I1204 18:58:32.321109 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d92kf" podStartSLOduration=1.886521525 podStartE2EDuration="4.321078984s" podCreationTimestamp="2025-12-04 18:58:28 +0000 UTC" firstStartedPulling="2025-12-04 18:58:29.258526879 +0000 UTC m=+5520.619601281" lastFinishedPulling="2025-12-04 18:58:31.693084308 +0000 UTC m=+5523.054158740" observedRunningTime="2025-12-04 18:58:32.316974538 +0000 UTC m=+5523.678048970" watchObservedRunningTime="2025-12-04 18:58:32.321078984 +0000 UTC m=+5523.682153426"
Dec 04 18:58:38 crc kubenswrapper[4948]: I1204 18:58:38.452634 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d92kf"
Dec 04 18:58:38 crc kubenswrapper[4948]: I1204 18:58:38.453901 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d92kf"
Dec 04 18:58:39 crc kubenswrapper[4948]: I1204 18:58:39.537753 4948 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d92kf" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="registry-server" probeResult="failure" output=<
Dec 04 18:58:39 crc kubenswrapper[4948]: timeout: failed to connect service ":50051" within 1s
Dec 04 18:58:39 crc kubenswrapper[4948]: >
Dec 04 18:58:40 crc kubenswrapper[4948]: I1204 18:58:40.624928 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 18:58:40 crc kubenswrapper[4948]: I1204 18:58:40.625000 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 18:58:48 crc kubenswrapper[4948]: I1204 18:58:48.534707 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d92kf"
Dec 04 18:58:48 crc kubenswrapper[4948]: I1204 18:58:48.610180 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d92kf"
Dec 04 18:58:48 crc kubenswrapper[4948]: I1204 18:58:48.789701 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d92kf"]
Dec 04 18:58:50 crc kubenswrapper[4948]: I1204 18:58:50.487405 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d92kf" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="registry-server" containerID="cri-o://2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241" gracePeriod=2
Dec 04 18:58:50 crc kubenswrapper[4948]: I1204 18:58:50.969665 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d92kf"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.132002 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-utilities\") pod \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") "
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.132160 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-catalog-content\") pod \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") "
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.132216 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwrwx\" (UniqueName: \"kubernetes.io/projected/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-kube-api-access-hwrwx\") pod \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\" (UID: \"ce072d47-cfb3-49d7-9cdc-c26b50c9d564\") "
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.133804 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-utilities" (OuterVolumeSpecName: "utilities") pod "ce072d47-cfb3-49d7-9cdc-c26b50c9d564" (UID: "ce072d47-cfb3-49d7-9cdc-c26b50c9d564"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.143749 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-kube-api-access-hwrwx" (OuterVolumeSpecName: "kube-api-access-hwrwx") pod "ce072d47-cfb3-49d7-9cdc-c26b50c9d564" (UID: "ce072d47-cfb3-49d7-9cdc-c26b50c9d564"). InnerVolumeSpecName "kube-api-access-hwrwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.234517 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.234577 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwrwx\" (UniqueName: \"kubernetes.io/projected/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-kube-api-access-hwrwx\") on node \"crc\" DevicePath \"\""
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.289581 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce072d47-cfb3-49d7-9cdc-c26b50c9d564" (UID: "ce072d47-cfb3-49d7-9cdc-c26b50c9d564"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.336639 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce072d47-cfb3-49d7-9cdc-c26b50c9d564-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.499233 4948 generic.go:334] "Generic (PLEG): container finished" podID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerID="2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241" exitCode=0
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.499306 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerDied","Data":"2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241"}
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.499359 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d92kf" event={"ID":"ce072d47-cfb3-49d7-9cdc-c26b50c9d564","Type":"ContainerDied","Data":"ef6ce3ce292b48b8f79f1224508de368f5c9fcf30537aadb4976cb3736a7e63b"}
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.499357 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d92kf"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.499468 4948 scope.go:117] "RemoveContainer" containerID="2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.522888 4948 scope.go:117] "RemoveContainer" containerID="4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.553741 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d92kf"]
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.561820 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d92kf"]
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.576537 4948 scope.go:117] "RemoveContainer" containerID="56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.602066 4948 scope.go:117] "RemoveContainer" containerID="2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241"
Dec 04 18:58:51 crc kubenswrapper[4948]: E1204 18:58:51.602579 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241\": container with ID starting with 2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241 not found: ID does not exist" containerID="2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.602620 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241"} err="failed to get container status \"2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241\": rpc error: code = NotFound desc = could not find container \"2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241\": container with ID starting with 2469195c0ef158dca2c4baeacd1c02c44d370d33a55a25cd1861a38cb5463241 not found: ID does not exist"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.602657 4948 scope.go:117] "RemoveContainer" containerID="4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77"
Dec 04 18:58:51 crc kubenswrapper[4948]: E1204 18:58:51.603020 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77\": container with ID starting with 4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77 not found: ID does not exist" containerID="4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.603078 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77"} err="failed to get container status \"4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77\": rpc error: code = NotFound desc = could not find container \"4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77\": container with ID starting with 4d216244dcd91420ba2107f563510aefdf28a0ee225836d972c7d4ad488ecb77 not found: ID does not exist"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.603101 4948 scope.go:117] "RemoveContainer" containerID="56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097"
Dec 04 18:58:51 crc kubenswrapper[4948]: E1204 18:58:51.603485 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097\": container with ID starting with 56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097 not found: ID does not exist" containerID="56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097"
Dec 04 18:58:51 crc kubenswrapper[4948]: I1204 18:58:51.603542 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097"} err="failed to get container status \"56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097\": rpc error: code = NotFound desc = could not find container \"56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097\": container with ID starting with 56757d4a83198c566c7e5679630bd746dbb90356d6c884bea2584ff57d611097 not found: ID does not exist"
Dec 04 18:58:52 crc kubenswrapper[4948]: I1204 18:58:52.930926 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" path="/var/lib/kubelet/pods/ce072d47-cfb3-49d7-9cdc-c26b50c9d564/volumes"
Dec 04 18:59:10 crc kubenswrapper[4948]: I1204 18:59:10.624974 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 18:59:10 crc kubenswrapper[4948]: I1204 18:59:10.625506 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 18:59:10 crc kubenswrapper[4948]: I1204 18:59:10.625551 4948 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4"
Dec 04 18:59:10 crc kubenswrapper[4948]: I1204 18:59:10.626180 4948 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23d99bd304150fa77813e5ebbfd3e6f62f8172b107cd68932012eabeaf18a1f2"} pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 18:59:10 crc kubenswrapper[4948]: I1204 18:59:10.626228 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" containerID="cri-o://23d99bd304150fa77813e5ebbfd3e6f62f8172b107cd68932012eabeaf18a1f2" gracePeriod=600
Dec 04 18:59:11 crc kubenswrapper[4948]: I1204 18:59:11.697127 4948 generic.go:334] "Generic (PLEG): container finished" podID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerID="23d99bd304150fa77813e5ebbfd3e6f62f8172b107cd68932012eabeaf18a1f2" exitCode=0
Dec 04 18:59:11 crc kubenswrapper[4948]: I1204 18:59:11.697497 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerDied","Data":"23d99bd304150fa77813e5ebbfd3e6f62f8172b107cd68932012eabeaf18a1f2"}
Dec 04 18:59:11 crc kubenswrapper[4948]: I1204 18:59:11.697538 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" event={"ID":"9c5bb3e4-2f5a-47d7-a998-be50d1429cb2","Type":"ContainerStarted","Data":"57d1527e601fce83415514956790f3132a8084e02f716ca9d43fbedf8bf29be6"}
Dec 04 18:59:11 crc kubenswrapper[4948]: I1204 18:59:11.697566 4948 scope.go:117] "RemoveContainer" containerID="8311b7104deed4d7bf331930390f75ccd0a7ff780c49bdbbfe7e9daa10bfa250"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.162942 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"]
Dec 04 19:00:00 crc kubenswrapper[4948]: E1204 19:00:00.163836 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="registry-server"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.163853 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="registry-server"
Dec 04 19:00:00 crc kubenswrapper[4948]: E1204 19:00:00.163877 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="extract-utilities"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.163885 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="extract-utilities"
Dec 04 19:00:00 crc kubenswrapper[4948]: E1204 19:00:00.163920 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="extract-content"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.163929 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="extract-content"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.164140 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce072d47-cfb3-49d7-9cdc-c26b50c9d564" containerName="registry-server"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.164842 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.168041 4948 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.170918 4948 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.177409 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"]
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.216095 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqlqc\" (UniqueName: \"kubernetes.io/projected/190e7278-5ff9-456e-ac95-2ccdc1f3a218-kube-api-access-xqlqc\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.216210 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190e7278-5ff9-456e-ac95-2ccdc1f3a218-config-volume\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.216380 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190e7278-5ff9-456e-ac95-2ccdc1f3a218-secret-volume\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.318326 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqlqc\" (UniqueName: \"kubernetes.io/projected/190e7278-5ff9-456e-ac95-2ccdc1f3a218-kube-api-access-xqlqc\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.318433 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190e7278-5ff9-456e-ac95-2ccdc1f3a218-config-volume\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.318498 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190e7278-5ff9-456e-ac95-2ccdc1f3a218-secret-volume\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.320290 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190e7278-5ff9-456e-ac95-2ccdc1f3a218-config-volume\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.334827 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190e7278-5ff9-456e-ac95-2ccdc1f3a218-secret-volume\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.348367 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqlqc\" (UniqueName: \"kubernetes.io/projected/190e7278-5ff9-456e-ac95-2ccdc1f3a218-kube-api-access-xqlqc\") pod \"collect-profiles-29414580-crrqt\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.517900 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:00 crc kubenswrapper[4948]: I1204 19:00:00.815015 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"]
Dec 04 19:00:01 crc kubenswrapper[4948]: I1204 19:00:01.284310 4948 generic.go:334] "Generic (PLEG): container finished" podID="190e7278-5ff9-456e-ac95-2ccdc1f3a218" containerID="03fcc0e7208230d2181bc3c1f007d804cf22ba5567f34c5fcc295ebe32d63ae3" exitCode=0
Dec 04 19:00:01 crc kubenswrapper[4948]: I1204 19:00:01.284407 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt" event={"ID":"190e7278-5ff9-456e-ac95-2ccdc1f3a218","Type":"ContainerDied","Data":"03fcc0e7208230d2181bc3c1f007d804cf22ba5567f34c5fcc295ebe32d63ae3"}
Dec 04 19:00:01 crc kubenswrapper[4948]: I1204 19:00:01.284848 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt" event={"ID":"190e7278-5ff9-456e-ac95-2ccdc1f3a218","Type":"ContainerStarted","Data":"b514fa0f4f31f8a39469a34f16b9edd4bd51057e421a0f0aeb2579f6aad3b58c"}
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.675287 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.859479 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190e7278-5ff9-456e-ac95-2ccdc1f3a218-secret-volume\") pod \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") "
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.859668 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqlqc\" (UniqueName: \"kubernetes.io/projected/190e7278-5ff9-456e-ac95-2ccdc1f3a218-kube-api-access-xqlqc\") pod \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") "
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.859769 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190e7278-5ff9-456e-ac95-2ccdc1f3a218-config-volume\") pod \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\" (UID: \"190e7278-5ff9-456e-ac95-2ccdc1f3a218\") "
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.861068 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/190e7278-5ff9-456e-ac95-2ccdc1f3a218-config-volume" (OuterVolumeSpecName: "config-volume") pod "190e7278-5ff9-456e-ac95-2ccdc1f3a218" (UID: "190e7278-5ff9-456e-ac95-2ccdc1f3a218"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.866481 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/190e7278-5ff9-456e-ac95-2ccdc1f3a218-kube-api-access-xqlqc" (OuterVolumeSpecName: "kube-api-access-xqlqc") pod "190e7278-5ff9-456e-ac95-2ccdc1f3a218" (UID: "190e7278-5ff9-456e-ac95-2ccdc1f3a218"). InnerVolumeSpecName "kube-api-access-xqlqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.870628 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/190e7278-5ff9-456e-ac95-2ccdc1f3a218-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "190e7278-5ff9-456e-ac95-2ccdc1f3a218" (UID: "190e7278-5ff9-456e-ac95-2ccdc1f3a218"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.962129 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqlqc\" (UniqueName: \"kubernetes.io/projected/190e7278-5ff9-456e-ac95-2ccdc1f3a218-kube-api-access-xqlqc\") on node \"crc\" DevicePath \"\""
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.962187 4948 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/190e7278-5ff9-456e-ac95-2ccdc1f3a218-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 19:00:02 crc kubenswrapper[4948]: I1204 19:00:02.962209 4948 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/190e7278-5ff9-456e-ac95-2ccdc1f3a218-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 04 19:00:03 crc kubenswrapper[4948]: I1204 19:00:03.306231 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt" event={"ID":"190e7278-5ff9-456e-ac95-2ccdc1f3a218","Type":"ContainerDied","Data":"b514fa0f4f31f8a39469a34f16b9edd4bd51057e421a0f0aeb2579f6aad3b58c"}
Dec 04 19:00:03 crc kubenswrapper[4948]: I1204 19:00:03.306290 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414580-crrqt"
Dec 04 19:00:03 crc kubenswrapper[4948]: I1204 19:00:03.306294 4948 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b514fa0f4f31f8a39469a34f16b9edd4bd51057e421a0f0aeb2579f6aad3b58c"
Dec 04 19:00:03 crc kubenswrapper[4948]: I1204 19:00:03.746691 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww"]
Dec 04 19:00:03 crc kubenswrapper[4948]: I1204 19:00:03.751004 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414535-lk7ww"]
Dec 04 19:00:04 crc kubenswrapper[4948]: I1204 19:00:04.924935 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0039de0-35fb-4079-81a8-d502073646f7" path="/var/lib/kubelet/pods/b0039de0-35fb-4079-81a8-d502073646f7/volumes"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.263882 4948 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5t24x"]
Dec 04 19:00:33 crc kubenswrapper[4948]: E1204 19:00:33.264963 4948 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="190e7278-5ff9-456e-ac95-2ccdc1f3a218" containerName="collect-profiles"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.264987 4948 state_mem.go:107] "Deleted CPUSet assignment" podUID="190e7278-5ff9-456e-ac95-2ccdc1f3a218" containerName="collect-profiles"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.265335 4948 memory_manager.go:354] "RemoveStaleState removing state" podUID="190e7278-5ff9-456e-ac95-2ccdc1f3a218" containerName="collect-profiles"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.267959 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.282142 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t24x"]
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.368177 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzsm4\" (UniqueName: \"kubernetes.io/projected/202a0460-472d-4671-96fb-e35fb2042ae3-kube-api-access-jzsm4\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.368276 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-utilities\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.368329 4948 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-catalog-content\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.470064 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-catalog-content\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.470217 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzsm4\" (UniqueName: \"kubernetes.io/projected/202a0460-472d-4671-96fb-e35fb2042ae3-kube-api-access-jzsm4\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.470329 4948 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-utilities\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.470754 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-catalog-content\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.470861 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-utilities\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.493305 4948 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzsm4\" (UniqueName: \"kubernetes.io/projected/202a0460-472d-4671-96fb-e35fb2042ae3-kube-api-access-jzsm4\") pod \"community-operators-5t24x\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") " pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:33 crc kubenswrapper[4948]: I1204 19:00:33.603806 4948 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:34 crc kubenswrapper[4948]: I1204 19:00:34.125715 4948 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t24x"]
Dec 04 19:00:34 crc kubenswrapper[4948]: I1204 19:00:34.599943 4948 generic.go:334] "Generic (PLEG): container finished" podID="202a0460-472d-4671-96fb-e35fb2042ae3" containerID="648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d" exitCode=0
Dec 04 19:00:34 crc kubenswrapper[4948]: I1204 19:00:34.600266 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerDied","Data":"648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d"}
Dec 04 19:00:34 crc kubenswrapper[4948]: I1204 19:00:34.600296 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerStarted","Data":"579abef5abefd9cb1e4667f9fb1520404311fc1fc56c965d59c4bb7896e9415f"}
Dec 04 19:00:34 crc kubenswrapper[4948]: I1204 19:00:34.603889 4948 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 19:00:35 crc kubenswrapper[4948]: I1204 19:00:35.609970 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerStarted","Data":"15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5"}
Dec 04 19:00:36 crc kubenswrapper[4948]: I1204 19:00:36.621346 4948 generic.go:334] "Generic (PLEG): container finished" podID="202a0460-472d-4671-96fb-e35fb2042ae3" containerID="15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5" exitCode=0
Dec 04 19:00:36 crc kubenswrapper[4948]: I1204 19:00:36.621408 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerDied","Data":"15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5"}
Dec 04 19:00:37 crc kubenswrapper[4948]: I1204 19:00:37.629149 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerStarted","Data":"de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514"}
Dec 04 19:00:37 crc kubenswrapper[4948]: I1204 19:00:37.655573 4948 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5t24x" podStartSLOduration=2.262502471 podStartE2EDuration="4.65554685s" podCreationTimestamp="2025-12-04 19:00:33 +0000 UTC" firstStartedPulling="2025-12-04 19:00:34.603530122 +0000 UTC m=+5645.964604544" lastFinishedPulling="2025-12-04 19:00:36.996574521 +0000 UTC m=+5648.357648923" observedRunningTime="2025-12-04 19:00:37.645687182 +0000 UTC m=+5649.006761584" watchObservedRunningTime="2025-12-04 19:00:37.65554685 +0000 UTC m=+5649.016621292"
Dec 04 19:00:42 crc kubenswrapper[4948]: I1204 19:00:42.601353 4948 scope.go:117] "RemoveContainer" containerID="8906b592fcd969d924a5bab5f2cc38953829faef436db60819607aa46fc0a746"
Dec 04 19:00:43 crc kubenswrapper[4948]: I1204 19:00:43.604938 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:43 crc kubenswrapper[4948]: I1204 19:00:43.605014 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:43 crc kubenswrapper[4948]: I1204 19:00:43.667888 4948 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:43 crc kubenswrapper[4948]: I1204 19:00:43.747026 4948 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:43 crc kubenswrapper[4948]: I1204 19:00:43.921589 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t24x"]
Dec 04 19:00:45 crc kubenswrapper[4948]: I1204 19:00:45.700764 4948 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5t24x" podUID="202a0460-472d-4671-96fb-e35fb2042ae3" containerName="registry-server" containerID="cri-o://de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514" gracePeriod=2
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.166616 4948 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t24x"
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.303808 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-utilities\") pod \"202a0460-472d-4671-96fb-e35fb2042ae3\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") "
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.304284 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-catalog-content\") pod \"202a0460-472d-4671-96fb-e35fb2042ae3\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") "
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.304413 4948 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzsm4\" (UniqueName: \"kubernetes.io/projected/202a0460-472d-4671-96fb-e35fb2042ae3-kube-api-access-jzsm4\") pod \"202a0460-472d-4671-96fb-e35fb2042ae3\" (UID: \"202a0460-472d-4671-96fb-e35fb2042ae3\") "
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.305307 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-utilities" (OuterVolumeSpecName: "utilities") pod "202a0460-472d-4671-96fb-e35fb2042ae3" (UID: "202a0460-472d-4671-96fb-e35fb2042ae3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.318633 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202a0460-472d-4671-96fb-e35fb2042ae3-kube-api-access-jzsm4" (OuterVolumeSpecName: "kube-api-access-jzsm4") pod "202a0460-472d-4671-96fb-e35fb2042ae3" (UID: "202a0460-472d-4671-96fb-e35fb2042ae3"). InnerVolumeSpecName "kube-api-access-jzsm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.361772 4948 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "202a0460-472d-4671-96fb-e35fb2042ae3" (UID: "202a0460-472d-4671-96fb-e35fb2042ae3"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.405896 4948 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.405924 4948 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzsm4\" (UniqueName: \"kubernetes.io/projected/202a0460-472d-4671-96fb-e35fb2042ae3-kube-api-access-jzsm4\") on node \"crc\" DevicePath \"\"" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.405939 4948 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/202a0460-472d-4671-96fb-e35fb2042ae3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.713455 4948 generic.go:334] "Generic (PLEG): container finished" podID="202a0460-472d-4671-96fb-e35fb2042ae3" containerID="de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514" exitCode=0 Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.713540 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerDied","Data":"de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514"} Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.713587 4948 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t24x" event={"ID":"202a0460-472d-4671-96fb-e35fb2042ae3","Type":"ContainerDied","Data":"579abef5abefd9cb1e4667f9fb1520404311fc1fc56c965d59c4bb7896e9415f"} Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.713598 4948 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t24x" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.713648 4948 scope.go:117] "RemoveContainer" containerID="de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.742625 4948 scope.go:117] "RemoveContainer" containerID="15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.784812 4948 scope.go:117] "RemoveContainer" containerID="648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.787932 4948 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t24x"] Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.805389 4948 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5t24x"] Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.820712 4948 scope.go:117] "RemoveContainer" containerID="de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514" Dec 04 19:00:46 crc kubenswrapper[4948]: E1204 19:00:46.823435 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514\": container with ID starting with de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514 not found: ID does not exist" containerID="de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.823494 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514"} err="failed to get container status \"de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514\": rpc error: code = NotFound desc = could not find 
container \"de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514\": container with ID starting with de6d8c0bb486612646a08b4f93db2688f9c189a5004f28522f0c7a6f51d1b514 not found: ID does not exist" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.823531 4948 scope.go:117] "RemoveContainer" containerID="15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5" Dec 04 19:00:46 crc kubenswrapper[4948]: E1204 19:00:46.824070 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5\": container with ID starting with 15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5 not found: ID does not exist" containerID="15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.824104 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5"} err="failed to get container status \"15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5\": rpc error: code = NotFound desc = could not find container \"15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5\": container with ID starting with 15862a75ceec215689fdbf2cbf30de63f601485c358d68bdd1faab654a97e9e5 not found: ID does not exist" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.824126 4948 scope.go:117] "RemoveContainer" containerID="648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d" Dec 04 19:00:46 crc kubenswrapper[4948]: E1204 19:00:46.824476 4948 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d\": container with ID starting with 648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d not found: ID does 
not exist" containerID="648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.824515 4948 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d"} err="failed to get container status \"648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d\": rpc error: code = NotFound desc = could not find container \"648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d\": container with ID starting with 648315b8180ec05e69461ccaab4a2048b81b7b99ec8ba9eb9827a9b9a19c6d8d not found: ID does not exist" Dec 04 19:00:46 crc kubenswrapper[4948]: I1204 19:00:46.924559 4948 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="202a0460-472d-4671-96fb-e35fb2042ae3" path="/var/lib/kubelet/pods/202a0460-472d-4671-96fb-e35fb2042ae3/volumes" Dec 04 19:01:10 crc kubenswrapper[4948]: I1204 19:01:10.624997 4948 patch_prober.go:28] interesting pod/machine-config-daemon-hfvn4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 19:01:10 crc kubenswrapper[4948]: I1204 19:01:10.625545 4948 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hfvn4" podUID="9c5bb3e4-2f5a-47d7-a998-be50d1429cb2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"